Enabling energy-aware and lossy-aware data compression in wireless sensor networks by multi-objective evolutionary optimization


Enabling energy-aware and lossy-aware data compression in wireless sensor networks by multi-objective evolutionary optimization:

Enabling energy-aware and lossy-aware data compression in wireless sensor networks by multi-objective evolutionary optimization. Authors: Francesco Marcelloni, Massimo Vecchio. Presented by: Iraj Hedayati ([email protected]). QIAU – Data Compression, Dr. M. Shahram Moien

Outline:

Outline: Wireless sensor network principles; Why data compression?; De-noising techniques; Quantizing: DPCM; MOEAs; NSGA-II; Compression method

Node structure:

Node structure

Wireless Sensor Networks:

Wireless Sensor Networks

Data propagation:

Data propagation

Why Compression?:

Why Compression? Sensor nodes run on restricted battery power, and the radio is the biggest power consumer, so compressing data before transmission saves energy. However, nodes have little memory and a weak processor, which motivates a new compression approach.

De-Noising Technique:

De-Noising Technique. 1. Online (sensor-side). Benefits: energy-aware, low complexity. Problem: high memory requirements. 2. Offline (base-side). Benefit: lossless compression can be used. Problems: errors cannot be reported in real time, the noisy data's higher entropy hurts compression, and transmitting it costs more power.

Differential Pulse Code Modulation (DPCM):

Differential Pulse Code Modulation (DPCM). Consecutive samples of a regular signal are close to each other, so the differences between samples are small; DPCM encodes these differences instead of the raw values.
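As a minimal sketch of the idea (function names are mine, not from the paper), lossless difference coding and its inverse look like:

```python
def dpcm_encode(samples):
    """Keep the first sample raw; encode the rest as differences."""
    return [samples[0]] + [cur - prev for prev, cur in zip(samples, samples[1:])]

def dpcm_decode(stream):
    """Rebuild the samples by accumulating the differences."""
    out = [stream[0]]
    for d in stream[1:]:
        out.append(out[-1] + d)
    return out
```

Without quantization the round trip is exact; the small differences are what make short codewords effective later.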

Differential Pulse Code Modulation (DPCM) [cont.]:

Differential Pulse Code Modulation (DPCM) [cont.] s0 is written on the compressed stream in raw format. The differences are then scalar-quantized, which makes the compression lossy and introduces a quantization error. The quantizer is characterized by its cells, its levels, and a dead zone around zero.

Multi-Objective Evolutionary Algorithms (MOEA):

Multi-Objective Evolutionary Algorithms (MOEA). Pareto dominance: given two solutions x and y with objective vectors u and v, x dominates y when u_i ≤ v_i for every criterion i and u_i < v_i for at least one, where I is the number of criteria and u_i and v_i are the i-th elements of u and v. A solution is Pareto optimal if it is not dominated by any other; the Pareto front, the set of Pareto-optimal solutions, is the goal of the optimization.
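The dominance relation and the front extraction can be sketched directly from the definition (minimization assumed; function names are mine):

```python
def dominates(u, v):
    """u dominates v: no worse in every criterion, strictly better in one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_front(solutions):
    """The non-dominated subset of a list of objective vectors."""
    return [u for u in solutions
            if not any(dominates(v, u) for v in solutions if v != u)]
```

For two objectives such as compression ratio loss vs. reconstruction error, the front is the set of trade-off points no other solution improves on in both.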

Multi-Objective Evolutionary Algorithms (MOEA) [cont.]:

Multi-Objective Evolutionary Algorithms (MOEA) [cont.] Most popular MOEAs: Strength Pareto Evolutionary Algorithm (SPEA) and its evolution SPEA2; Niched Pareto Genetic Algorithm (NPGA); Pareto Archived Evolution Strategy (PAES); Non-dominated Sorting Genetic Algorithm (NSGA) and its evolution NSGA-II.

Compression block Diagram:

Compression block diagram

Compression block diagram:

Compression block diagram

Codeword Assignment:

Codeword Assignment. Each cell in the domain maps to a quantization index. The indices occur with unequal probabilities, so entropy encoding (Huffman coding) assigns the shortest codewords to the most probable indices.
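A compact Huffman construction over the index probabilities can be sketched as follows (the symbol set and probabilities in the usage are illustrative, loosely shaped like the tables later in this presentation):

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Map each symbol to a binary codeword; rarer symbols get longer words."""
    tie = count()  # tie-breaker so the heap never compares the code dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]
```

With a highly skewed distribution (index 0 dominates, as in the results), the dominant index receives a single-bit codeword.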

Quantization Operator:

Quantization Operator

Chromosome coding:

Chromosome coding. Each chromosome codifies a different quantizer. Chromosome parameters: width of the dead zone (DW), width of the cells in the first sub-region (FW), number of cells in the first sub-region (FN), width of the cells in the second sub-region (SW), number of cells in the second sub-region (SN). The resulting cell boundaries are 0, DW, DW + FW·FN, DW + FW·FN + SW·SN on the positive side, mirrored as −DW, −DW − FW·FN, −DW − FW·FN − SW·SN on the negative side.
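A sketch of how the five genes define the positive-side cell boundaries and the resulting quantization operator (function names are mine):

```python
def boundaries(DW, FW, FN, SW, SN):
    """Cumulative positive-side boundaries: dead-zone edge, then FN cells
    of width FW, then SN cells of width SW."""
    b = [DW]
    for _ in range(FN):
        b.append(b[-1] + FW)
    for _ in range(SN):
        b.append(b[-1] + SW)
    return b

def quantize(d, b):
    """Signed index of the cell containing difference d; the dead zone
    [-DW, DW] maps to index 0, values past the last boundary saturate."""
    mag, sign = abs(d), (1 if d > 0 else -1)
    for i, edge in enumerate(b):
        if mag <= edge:
            return 0 if i == 0 else sign * i
    return sign * (len(b) - 1)
```

With solution A from the experimental results (DZ = 8, FW = 1, FN = 3, SW = 2, SN = 1) the boundaries are [8, 9, 10, 11, 13], yielding the nine indices −4 … 4 that appear in the codeword table.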

NSGA-II operation:

NSGA-II operation. Start with an initial population P0 (here N_pop = 100). For each individual p, compute n_p, the number of individuals that dominate p, and S_p, the set of individuals dominated by p. When all p have been processed, rank all individuals into non-dominated fronts. Generate offspring Q_t by the genetic operations, then rank the extended population P_ext = P_t + Q_t.
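The n_p / S_p bookkeeping above is NSGA-II's fast non-dominated sort; a self-contained sketch (minimization, my function names):

```python
def dominates(u, v):
    """No worse in every objective, strictly better in at least one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def fast_nondominated_sort(objs):
    """Rank solution indices into fronts using domination counts n_p
    and dominated sets S_p."""
    N = len(objs)
    n = [0] * N                 # n_p: how many solutions dominate p
    S = [[] for _ in range(N)]  # S_p: solutions dominated by p
    fronts = [[]]
    for p in range(N):
        for q in range(N):
            if dominates(objs[p], objs[q]):
                S[p].append(q)
            elif dominates(objs[q], objs[p]):
                n[p] += 1
        if n[p] == 0:
            fronts[0].append(p)  # non-dominated: rank-0 front
    while fronts[-1]:
        nxt = []
        for p in fronts[-1]:
            for q in S[p]:
                n[q] -= 1
                if n[q] == 0:    # all of q's dominators have been peeled off
                    nxt.append(q)
        fronts.append(nxt)
    return fronts[:-1]
```

Front 0 is the current Pareto-optimal approximation; lower-ranked fronts fill the next population when elitist selection truncates P_ext back to N_pop.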

Design a lossy algorithm:

Design a lossy algorithm. Start: collect a set of samples. Select a de-noising algorithm and apply it to the samples to generate a training set. Apply the MOEA to determine the quantizer parameters. Select one solution from the Pareto front. Finally, configure the lossy compression algorithm and transfer the code to the nodes.

Experimental results:

Experimental results

Deployment name  | Node ID | Symbolic name | Samples | From       | To
FishNet          | 101     | FN101         | 12651   | 09/08/2007 | 31/08/2007
Gr. St. Bernard  | 10      | GSB10         | 23813   | 15/09/2007 | 18/10/2007
Le Gènèpi        | 20      | LG20          | 21523   | 04/09/2007 | 03/10/2007

Experimental results [cont.]:

Experimental results [cont.]

Parameter | A | B  | C
DZ        | 8 | 32 | 63
FW        | 1 | 15 | 61
FN        | 3 | 1  | 1
SW        | 2 | 5  | 44
SN        | 1 | 1  | 1

Codewords:

Codewords

Index | Probability (A; B; C)    | Codeword (A; B; C)
−4    | 0.1198; –; –             | 101; –; –
−3    | 0.0163; –; –             | 100011; –; –
−2    | 0.0200; 0.0052; 0        | 100001; 1011; 1111
−1    | 0.0252; 0.0311; 0.0131   | 10010; 11; 110
0     | 0.6309; 0.9258; 0.9730   | 0; 0; 0
1     | 0.0023; 0.0311; 0.0139   | 10011; 100; 10
2     | 0.0186; 0.0069; 0        | 100010; 1010; 1110
3     | 0.0206; –; –             | 100000; –; –
4     | 0.1258; –; –             | 11; –; –
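Using the index probabilities and codewords of solution A from the table above, one can check that the code is prefix-free and compute the expected bits per quantized difference:

```python
codes_A = {-4: "101", -3: "100011", -2: "100001", -1: "10010", 0: "0",
           1: "10011", 2: "100010", 3: "100000", 4: "11"}
probs_A = {-4: 0.1198, -3: 0.0163, -2: 0.0200, -1: 0.0252, 0: 0.6309,
           1: 0.0023, 2: 0.0186, 3: 0.0206, 4: 0.1258}

def prefix_free(codes):
    """A decodable code: no codeword is a proper prefix of another."""
    ws = list(codes.values())
    return not any(a != b and b.startswith(a) for a in ws for b in ws)

# Expected codeword length in bits per sample.
avg_bits = sum(p * len(codes_A[i]) for i, p in probs_A.items())
```

The single-bit codeword for the dominant index 0 pulls the average down to about 1.83 bits per sample.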

Results:

Results

LTC:

LTC (Lightweight Temporal Compression)

Sim-It Arm:

Sim-It Arm

Energy:

Energy

Thank you:

Thank you. Any questions?
