Reverse / leakage detection classes

Scared provides the following ready-to-use reverse and leakage detection classes.

class scared.analysis._analysis.CPAReverse(selection_function, model, precision='float32')[source]

Correlation Power Analysis (CPA) reverse analysis, using Pearson correlation coefficients.

ex

traces accumulator with shape (trace_size,)

Type:

numpy.ndarray

ex2

squared traces accumulator with shape (trace_size,)

Type:

numpy.ndarray

ey

intermediate data accumulator with shape (data_words,)

Type:

numpy.ndarray

ey2

squared intermediate data accumulator with shape (data_words,)

Type:

numpy.ndarray

exy

dot product (intermediate data, traces) accumulator with shape (data_words, trace_size)

Type:

numpy.ndarray

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray
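The five accumulators above fully determine the Pearson coefficients the distinguisher outputs. A minimal numpy sketch on synthetic data (an illustration of the standard formula, independent of scared's actual implementation) showing how the correlations can be recovered from ex, ex2, ey, ey2 and exy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trace_size, data_words = 200, 5, 3

traces = rng.normal(size=(n, trace_size))
data = rng.integers(0, 9, size=(n, data_words)).astype(np.float64)

# Accumulators as documented above.
ex = traces.sum(axis=0)           # (trace_size,)
ex2 = (traces ** 2).sum(axis=0)   # (trace_size,)
ey = data.sum(axis=0)             # (data_words,)
ey2 = (data ** 2).sum(axis=0)     # (data_words,)
exy = data.T @ traces             # (data_words, trace_size)

# Pearson correlation for every (data word, trace sample) pair.
num = n * exy - np.outer(ey, ex)
den = np.sqrt(np.outer(n * ey2 - ey ** 2, n * ex2 - ex ** 2))
r = num / den
```

Here r[i, j] matches np.corrcoef(data[:, i], traces[:, j])[0, 1]; keeping only these sums is what lets the analysis update its correlations incrementally, batch by batch.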

class scared.analysis._analysis.DPAReverse(*args, **kwargs)[source]

Differential Power Analysis (DPA) reverse analysis.

accumulator_traces

accumulator of all traces with shape (trace_size,)

Type:

numpy.ndarray

accumulator_ones

accumulator of traces corresponding to intermediate value 1, with shape (trace_size,)

Type:

numpy.ndarray

processed_ones

number of processed traces in accumulator_ones for each data word considered, with shape (data_words,)

Type:

numpy.ndarray

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray

__init__(*args, **kwargs)[source]

Initialize analysis.

Parameters:
  • selection_function (SelectionFunction) – selection function to compute intermediate values. Must inherit from SelectionFunction.

  • model (Model) – model instance to compute leakage intermediate values. Must inherit from Model.

  • precision (numpy.dtype, default=`float32`) – precision which will be used for computations.
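The DPA accumulators above support a difference-of-means computation. A numpy sketch for a single data word, on synthetic traces with an injected dependency (an illustration of the technique, not scared's exact code):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trace_size = 400, 6

traces = rng.normal(size=(n, trace_size))
bits = rng.integers(0, 2, size=n)   # intermediate bit for one data word
traces[bits == 1] += 0.5            # inject a dependency on the bit

# Accumulators as documented above (single data word case).
accumulator_traces = traces.sum(axis=0)           # (trace_size,)
accumulator_ones = traces[bits == 1].sum(axis=0)  # (trace_size,)
processed_ones = int(bits.sum())

# Difference of means between the "ones" and "zeros" partitions;
# it is large wherever the traces depend on the intermediate bit.
mean_ones = accumulator_ones / processed_ones
mean_zeros = (accumulator_traces - accumulator_ones) / (n - processed_ones)
dpa = mean_ones - mean_zeros
```

Note that accumulating only the total and the "ones" sums is enough: the "zeros" partition is recovered by subtraction.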

class scared.analysis._analysis.ANOVAReverse(partitions=None, *args, **kwargs)[source]

This standalone partitioned distinguisher applies the ANOVA F-test metric.

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray
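The ANOVA F-test metric compares between-class and within-class variance over the partitions of the intermediate values. A numpy sketch on a synthetic one-sample leakage (standard one-way ANOVA, shown for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)
partitions = np.arange(4)                # e.g. 4 intermediate-value classes
labels = rng.integers(0, 4, size=800)
traces = rng.normal(size=800) + 0.3 * labels   # class-dependent leakage

groups = [traces[labels == p] for p in partitions]
k, n = len(groups), traces.size

# One-way ANOVA F statistic: between-class mean square
# over within-class mean square.
grand_mean = traces.mean()
ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F at a given trace sample indicates that the sample's distribution differs across partitions, i.e. a leakage point.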

class scared.analysis._analysis.NICVReverse(partitions=None, *args, **kwargs)[source]

This standalone partitioned distinguisher applies the NICV (Normalized Inter-Class Variance) metric.

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray
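NICV is the variance of the per-partition means normalized by the total variance of the traces, Var(E[L|X]) / Var(L), which lies in [0, 1]. A numpy sketch of this standard definition on synthetic data (illustration only, not scared's implementation):

```python
import numpy as np

rng = np.random.default_rng(3)
labels = rng.integers(0, 16, size=2000)             # partition per trace
traces = rng.normal(size=2000) + 0.2 * (labels % 4)  # class-dependent leakage

# NICV = Var(E[L|X]) / Var(L): weighted variance of the per-class
# means over the total variance of the traces.
class_means = np.array([traces[labels == p].mean() for p in range(16)])
class_sizes = np.array([(labels == p).sum() for p in range(16)])
inter_class_var = np.average((class_means - traces.mean()) ** 2,
                             weights=class_sizes)
nicv = inter_class_var / traces.var()
```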

class scared.analysis._analysis.SNRReverse(partitions=None, *args, **kwargs)[source]

This standalone partitioned distinguisher applies the SNR (Signal to Noise Ratio) metric.

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray
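SNR relates the variance explained by the partitions to the residual noise, Var(E[L|X]) / E[Var(L|X)]; it differs from NICV only in the denominator (NICV = SNR / (1 + SNR)). A numpy sketch of this standard definition (illustration only):

```python
import numpy as np

rng = np.random.default_rng(4)
labels = rng.integers(0, 9, size=3000)      # e.g. Hamming weight classes
traces = 0.25 * labels + rng.normal(size=3000)   # noisy leakage

# SNR = Var(E[L|X]) / E[Var(L|X)]: variance of the class means
# over the mean of the within-class variances.
means = np.array([traces[labels == p].mean() for p in range(9)])
variances = np.array([traces[labels == p].var() for p in range(9)])
sizes = np.array([(labels == p).sum() for p in range(9)])
signal = np.average((means - traces.mean()) ** 2, weights=sizes)
noise = np.average(variances, weights=sizes)
snr = signal / noise
```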

class scared.analysis._analysis.MIAReverse(bins_number=128, bin_edges=None, *args, **kwargs)[source]

This partitioned distinguisher mixin applies a mutual information computation.

Base class for all reverse analysis processing objects.

This class must be subclassed and combined with a mixin inheriting from DistinguisherMixin. It provides the common processing method for a side-channel statistical analysis.

The base class:
  • initializes the state before processing any traces

  • provides methods either to process traces and compute results manually, or to run a complete processing of a Container instance

  • manages the results attribute: the distinguisher method output (results)

results

array containing the latest results obtained from the distinguisher computing.

Type:

numpy.ndarray

__init__(bins_number=128, bin_edges=None, *args, **kwargs)[source]

Initialize analysis.

Parameters:
  • selection_function (SelectionFunction) – selection function to compute intermediate values. Must inherit from SelectionFunction.

  • model (Model) – model instance to compute leakage intermediate values. Must inherit from Model.

  • precision (numpy.dtype, default=`float32`) – precision which will be used for computations.
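MIA estimates the mutual information between modelled intermediate values and trace samples from a binned joint distribution, which is what the bins_number and bin_edges parameters control. A numpy sketch of a histogram-based MI estimate on synthetic data (the standard plug-in estimator, not scared's exact code):

```python
import numpy as np

rng = np.random.default_rng(5)
n, bins_number = 5000, 16                # bins_number mirrors the parameter above
x = rng.integers(0, 4, size=n)           # modelled intermediate values
leakage = x + 0.5 * rng.normal(size=n)   # trace sample depending on x

# Plug-in mutual information (in bits) from a binned joint histogram.
joint, _, _ = np.histogram2d(x, leakage, bins=(4, bins_number))
pxy = joint / n
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0                             # skip empty bins (0 * log 0 = 0)
mi = float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

A finer binning (larger bins_number, or explicit bin_edges) trades estimation bias against variance.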

class scared.ttest.TTestContainer(ths_1, ths_2, frame=None, preprocesses=[])[source]

Wrapper container for trace header sets dedicated to TTest analysis.

Parameters:
  • ths_1 (TraceHeaderSet) – the first trace header set to use for the TTest.

  • ths_2 (TraceHeaderSet) – the second trace header set to use for the TTest.

containers

list of two Container.

Type:

list

__init__(ths_1, ths_2, frame=None, preprocesses=[])[source]

class scared.ttest.TTestAnalysis(precision='float32')[source]

TTest analysis class.

It processes a t-test on two trace header sets (e.g. fixed vs. random), a leakage detection technique able to detect any first-order leakage without knowledge of the leakage function or model.

It provides an API similar to Analysis, but simpler.

accumulators

list containing two instances of TTestAccumulator.

Type:

list

result

array containing the result of the t-test with the current state.

Type:

numpy.ndarray

Examples

Create your analysis object and run it on a t-test container:

container = scared.TTestContainer(ths_1, ths_2)
ttest = scared.TTestAnalysis()
ttest.run(container)
ttest.result

__init__(precision='float32')[source]

Initialize t-test object.

Parameters:

precision (numpy.dtype, default=`float32`) – precision which will be used for computations.

run(ttest_container)[source]

Process traces wrapped by ttest_container and compute the result.

Starting from the current state of this instance, the ttest containers are processed by batch.

Parameters:

ttest_container (TTestContainer) – a TTestContainer instance wrapping the trace header sets.
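The result attribute holds one t-statistic per trace sample. A numpy sketch of a Welch-style two-sample t-test on synthetic trace sets (an illustration of the technique from the per-set means and variances, not scared's exact code):

```python
import numpy as np

rng = np.random.default_rng(6)
traces_1 = rng.normal(size=(1000, 8))    # e.g. the "fixed" set
traces_2 = rng.normal(size=(1000, 8))    # e.g. the "random" set
traces_2[:, 3] += 0.4                    # leak at sample 3 only

# Welch's t-statistic per trace sample, from each set's
# mean and variance (no equal-variance assumption).
m1, m2 = traces_1.mean(axis=0), traces_2.mean(axis=0)
v1, v2 = traces_1.var(axis=0), traces_2.var(axis=0)
n1, n2 = len(traces_1), len(traces_2)
t = (m1 - m2) / np.sqrt(v1 / n1 + v2 / n2)
```

In TVLA-style leakage detection, samples where |t| exceeds a threshold (commonly 4.5) are flagged as leaking.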

class scared.ttest.TTestThreadAccumulator(precision)[source]

Accumulator class used for t-test analysis.

It is a threaded accumulator that processes a complete container, allowing multiple accumulations to run in parallel.

Parameters:

precision (np.dtype or str) – Data precision (dtype) to use.

processed_traces

number of traces processed

Type:

int

sum

array containing the sum of traces along the first axis

Type:

numpy.ndarray

sum_squared

array containing the sum of squared traces along the first axis

Type:

numpy.ndarray

mean

array containing the mean of traces

Type:

numpy.ndarray

var

array containing the variance of traces

Type:

numpy.ndarray

run()[source]

Launch the accumulation on the given Container, in the main thread.

start()[source]

Launch the accumulation on the given Container, in a separate thread.

join()[source]

Wait for the end of the thread processing and check for exceptions; re-raise if any.

compute()[source]

Compute and store the values of mean and var for the currently accumulated values.

update(traces)[source]

Given a traces array, update the sum and sum_squared attributes.
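The mean and var attributes follow directly from the sum and sum_squared accumulators. A numpy sketch of what successive update() calls followed by compute() produce (an illustration of the accumulation scheme, not scared's exact code):

```python
import numpy as np

rng = np.random.default_rng(7)
batches = [rng.normal(size=(100, 10)) for _ in range(5)]

# update(): accumulate per-batch sums along the first axis.
processed_traces = 0
acc_sum = np.zeros(10)            # mirrors the sum attribute
acc_sum_squared = np.zeros(10)    # mirrors the sum_squared attribute
for traces in batches:
    processed_traces += len(traces)
    acc_sum += traces.sum(axis=0)
    acc_sum_squared += (traces ** 2).sum(axis=0)

# compute(): derive mean and variance from the accumulators.
mean = acc_sum / processed_traces
var = acc_sum_squared / processed_traces - mean ** 2
```

This is why only two running arrays per set are needed to produce the t-test statistics, however many batches are processed.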

__init__(precision)[source]

Note: no validation is performed on inputs; this is delegated to the parent TTest analysis.

update(traces)[source]
stop()[source]

Inform the thread to stop at next batch processing.

run(container=None)[source]

Launch the accumulation on the given Container, in the main thread.

start(container)[source]

Launch the accumulation on the given Container, in a separate thread.

join()[source]

Wait for the end of the thread processing and check for exceptions; re-raise if any.

compute()[source]
exception scared.ttest.TTestError[source]