PySptools Changelog

What's new in PySptools 0.14.2 Beta

Nov 1, 2017
  • Update: the module pysptools.sklearn is renamed pysptools.skl to avoid a name clash.
  • Fix: the class classification.Output is now compatible with matplotlib 2.0.x and keeps compatibility with previous matplotlib versions.
  • Update: the function skl._plot_feature_importances supports the new keywords n_labels, height and sort.
  • New: the scikit-learn ensemble estimators HyperAdaBoostClassifier, HyperBaggingClassifier and HyperExtraTreesClassifier are added to the pysptools.skl module (see the sketch after this list).
  • Improvements to the documentation.
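
  A minimal sketch of one of the new ensemble estimators. The fit/classify pattern and the
  n_estimators keyword mirror the other Hyper* estimators and scikit-learn; treat them as
  assumptions rather than the documented API, and note that all data below is hypothetical.

      import numpy as np
      from pysptools.skl import HyperAdaBoostClassifier

      # Hypothetical training data in scikit-learn form:
      # X is (n_samples, n_bands), y holds the class labels.
      X = np.random.rand(2000, 200)
      y = np.random.randint(0, 3, 2000)

      M = np.random.rand(50, 40, 200)   # hypothetical (m x n x p) cube to classify

      clf = HyperAdaBoostClassifier(n_estimators=50)
      clf.fit(X, y)
      cmap = clf.classify(M)            # assumed to return a (m x n) classification map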

New in PySptools 0.14.1 Beta (Mar 23, 2017)

  • Improvements to HyperEstimatorCrossVal, no more warnings.
  • Move: classification.KMeans is moved to sklearn.KMeans; everything else stays the same.
  • Remove: classification.SVC; it is replaced by sklearn.HyperSVC.
  • New: sklearn.HyperGradientBoostingClassifier.
  • New: the sklearn.hyper_scale function applies preprocessing.scale to a hypercube.
  • New: the sklearn.shape_to_XY function reshapes and concatenates hypercubes and classification maps into X and y arrays (see the sketch after this list).
  • New: an example using HyperRandomForestClassifier.
  • New: a class method plot_feature_importances used by HyperRandomForestClassifier and HyperGradientBoostingClassifier.
  • Issue: the class classification.Output does not seem to work with the latest matplotlib version 2.0.0. Use the previous version, 1.5.3.
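
  A minimal sketch showing how hyper_scale and shape_to_XY might be combined with an
  estimator. The list arguments of shape_to_XY, the default HyperSVC constructor and the
  data shapes are assumptions; only the role of each function comes from the entries above.
  The module was later renamed pysptools.skl (see 0.14.2).

      import numpy as np
      from pysptools.sklearn import hyper_scale, shape_to_XY, HyperSVC

      M = np.random.rand(50, 40, 200)            # hypothetical (m x n x p) hypercube
      cmap = np.random.randint(0, 3, (50, 40))   # hypothetical classification map

      M_scaled = hyper_scale(M)                  # preprocessing.scale applied to the cube
      X, y = shape_to_XY([M_scaled], [cmap])     # flatten to X (m*n, p) and y (m*n,)

      clf = HyperSVC()
      clf.fit(X, y)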

New in PySptools 0.13.2 Beta (Apr 14, 2015)

  • Some classes and functions are moved to the *util* module:
  • ROIs.
  • the *formating* module content: convert2d, convert3d, normalize.
  • The module *formating* is removed.
  • New: the input validation is completely reworked. It is now composed of a main class *InputValidation* that lives in the *util* module and of secondary functions living in the different inval.py files present in the modules where inputs are validated. These functions support the decorator pattern.
  • New: input validating decorator functions are added to many of the classes composing the library.
  • New: the class *AbundanceClassification* is added to the module *classification*; this class takes an abundance maps array as input and outputs a classification map.
  • Fix: nfindr.pyx and the related files needed to compile the NFINDR Cython version are now included with the distribution.
  • Documentation fix: for the ROIs class add method, the documentation string was:
  • rois: `dictionary list`
  • Each parameter, a dictionary, represent a rectangle or a polygon.
  • For a rectangle: {'rec': (upper_left_x, upper_left_y, lower_right_x, lower_right_y)}
  • For a polygone: {'poly': ((x1,y1),(x2,y2), ...)}, the polygon don't need to be close.
  • You can define one or more rectangle and/or polygon for a same cluster.
  • The polygon and the rectangle must be VALID.
  • But the meaning is:
  • rois: `dictionary list`
  • Each parameter, a dictionary, represents a rectangle or a polygon. They use matrix coordinates.
  • For a rectangle: {'rec': (upper_left_line, upper_left_column, lower_right_line, lower_right_column)}
  • For a polygon: {'poly': ((l1,c1),(l2,c2), ...)}, **l** stands for line and **c** for column. The polygon does not need to be closed.
  • You can define one or more rectangles and/or polygons for the same cluster.
  • The polygons and rectangles must be well formed.
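
  A minimal usage sketch of the corrected semantics. Only the rectangle and polygon
  dictionary format comes from the docstring above; the ROIs constructor arguments and the
  cluster label argument of add are assumptions, and the coordinates are hypothetical.

      from pysptools.util import ROIs

      # All coordinates are matrix coordinates (line, column).
      rois = ROIs(50, 40)            # assumed constructor: cube height and width
      rois.add('Cluster 1',
               # upper_left_line, upper_left_column, lower_right_line, lower_right_column
               {'rec': (5, 5, 20, 30)},
               # (line, column) vertices; the polygon does not need to be closed
               {'poly': ((25, 5), (45, 5), (35, 35))})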

New in PySptools 0.13.1 Beta (Mar 9, 2015)

  • This is a maintenance version with many fixes and improvements. All the display and plot methods were revised. This version may have an impact on your code.
  • The dependency between the ENVI header file and the NFINDR, PPI, ATGP and FIPPI plot and display methods is removed. You should use this format for the axes dictionary (see the sketch after this list):
  • axes['wavelength'] : a wavelength list (1D python list). If None or not specified, the list is automatically numbered starting at 1.
  • axes['x'] : the x axis label, 'Wavelength' if None or not specified. axes['x'] is copied verbatim.
  • axes['y'] : the y axis label, 'Brightness' if None or not specified. axes['y'] is copied verbatim.
  • Fix: classes NFINDR and ATGP: internally the mask had a wrong format, resulting in ineffective masking.
  • Fix: classes NFINDR, PPI, ATGP and FIPPI: when calling the method get_idx(), the output coordinates had their axes inverted, e.g. the output was [(y1,x1),(y2,x2)...] but was supposed to be [(x1,y1),(x2,y2)...].
  • Fix: functions mATGP and ATGP (note: the classes NFINDR and ATGP call these functions): the last endmember returned by ATGP was always false (not an endmember); this problem was caused by an internal indexing error. The Cython version is fixed accordingly.
  • Fix: util.display_linear_stretch(): xrange replaced by range to keep Python 3 compatibility.
  • Fix: spectro.SpectrumConvexHullQuotient(): normalize parameter was ineffective, fixed.
  • Fix: spectro.FeaturesConvexHullQuotient(): the normalize parameter was added to the __init__() method.
  • Major code refactoring of the class classification.Output.
  • Added: mask and labels parameters to the class classification.Output.
  • Added: mask and interpolation parameters to the plot() and display() methods for the classes UCLS, NNLS and FCLS.
  • Added: stretch and colorMap parameters to the methods plot_single_map and display_single_map for the classes SAM, SID and NormXCorr.
  • Added: labels, mask and interpolation parameters to the plot and display methods for the classes SAM, SID and NormXCorr.
  • The classes KMeans and SVC now use the class classification.Output.
  • The dependencies between the classes ROIs and SVC are removed. The class ROIs can now plot itself and can be used everywhere a mask parameter is present.
  • Updated: all nbex_pine_creek1 to nbex_pine_creek4.
  • Updated: example ex_smokestack_v3.py: the call to SAM is replaced by a call to NormXCorr.
  • Now the examples run on both Python 2.7 and 3.3 (see ex_methanol_burner_v2.py, ex_convexhull.py, ex_hematite_v2.py and ex_smokestack_v3.py).
  • A stretch mode is added to the method classification.(SAM,SID,NormXCorr).plot_single_map()
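
  A minimal sketch of the axes dictionary format described above, applied to NFINDR. The
  wavelength values, the output path and the exact plot() signature are assumptions; only
  the dictionary keys and their meaning come from this entry.

      import numpy as np
      from pysptools.eea import NFINDR

      M = np.random.rand(50, 40, 200)          # hypothetical (m x n x p) cube

      axes = {}
      axes['wavelength'] = list(np.linspace(0.4, 2.4, 200))  # hypothetical band centers
      axes['x'] = 'Wavelength (micrometers)'   # copied verbatim to the x axis label
      axes['y'] = 'Reflectance'                # copied verbatim to the y axis label

      nfindr = NFINDR()
      endmembers = nfindr.extract(M, 4)
      nfindr.plot('./results', axes=axes)      # hypothetical output directory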

New in PySptools 0.13.0 Beta (Jan 20, 2015)

  • The class classification.SVC adapts the scikit-learn implementation of Support Vector Machines to the analysis of hyperspectral cubes.
  • An example of using the SVC class with the Pine Creek cube.

New in PySptools 0.12.2 Beta (Nov 15, 2014)

  • New:
  • Full Python 3.3 compatibility. However, there is a drawback: the SPy software ENVI file reader is not ported to Python 3.3, so pysptools uses a JSON version of the USGS spectral library instead. See the spectro module documentation.
  • The scikit-learn KMeans class is wrapped to make it user friendly when applied to a HS cube. This new class lives in the classification module.
  • A new Pine Creek example.

New in PySptools 0.12.1 Beta (Oct 17, 2014)

  • New:
  • This version adds compatibility with the IPython Notebook. A *display* method is introduced for many classes. It uses matplotlib and has the same role as the *plot* method. Calling *display* shows figures embedded in the Notebook.
  • The source is ported to Python 3.3. This porting is not complete because the SPy library runs on Python 2 only.

New in PySptools 0.12.0 Beta (Sep 3, 2014)

  • New:
  • A first version of the HySime hyperspectral signal subspace estimation.
  • The SAM, SID and NormXCorr classifier classes can now classify one spectrum at a time. Previously, they required two or more spectra to be classified.
  • A *noise_whitening* parameter was added to the HfcVd class. When set to *True*, a whitening is applied before calling HfcVd, which implements the NWHFC function (see the sketch after this list).
  • Fix:
  • The variance calculation for HfcVd was wrong: it was integer based but is supposed to be float based. Fixed.
  • The call to normalize() inside the HfcVd count() method was removed. Data conditioning now needs to be done before calling HfcVd. It is simpler this way.
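
  A minimal sketch of the NWHFC behaviour described above. The HfcVd constructor and the
  return value handling are assumptions; the count() method, the material_count module and
  the noise_whitening keyword come from this changelog.

      import numpy as np
      from pysptools.material_count import HfcVd

      M = np.random.rand(50, 40, 200)          # hypothetical (m x n x p) cube

      hfcvd = HfcVd()
      # noise_whitening=True whitens the data first, i.e. the NWHFC variant.
      n_endmembers = hfcvd.count(M, noise_whitening=True)
      print(n_endmembers)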

New in PySptools 0.11.0 Beta (Jul 5, 2014)

  • This version introduces a noise reduction module called 'noise'. The first version of this module is composed of:
  • the Savitzky-Golay filter, a convolution-based low-pass denoising algorithm that can be applied to bands or to spectra;
  • whitening;
  • MNF (maximum noise fraction), a first version.
  • To be compatible with the noise reduction module, ATGP and NFINDR needed some adjustments and were updated accordingly (see the sketch after this list):
  • the internal call to HfcVd is removed; the number of endmembers to extract is now mandatory;
  • a 'transform' parameter is added to NFINDR; it accepts a transformed HSI cube; when a transformed cube is submitted to NFINDR, the built-in PCA call is bypassed.
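
  A minimal sketch of the adjusted NFINDR call. The MNF apply() method name and the keyword
  placement are assumptions; the mandatory endmember count and the 'transform' parameter
  come from the entries above.

      import numpy as np
      from pysptools.noise import MNF
      from pysptools.eea import NFINDR

      M = np.random.rand(50, 40, 200)          # hypothetical (m x n x p) cube

      mnf = MNF()
      M_transformed = mnf.apply(M)             # assumed MNF interface

      nfindr = NFINDR()
      # The number of endmembers is now mandatory; passing the transformed cube
      # through 'transform' bypasses the built-in PCA call.
      endmembers = nfindr.extract(M, 4, transform=M_transformed)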

New in PySptools 0.10.1 Beta (May 19, 2014)

  • FCLS is fixed. To run this new FCLS you need to install CVXOPT. Note that you need CVXOPT only if you intend to run FCLS. Christoph Gohlke provides CVXOPT builds for Windows.

New in PySptools 0.10 Beta (May 15, 2014)

  • New:
  • For all the classifier classes: a new 'constrained' parameter and a new 'get_single_map()' method.
  • For the unmixing classes NFINDR and ATGP: a new 'mask' parameter
  • A comprehensive exception mechanism
  • Unit tests
  • Two examples, one with a hematite drill core and a second with a smokestack
  • Reorganization:
  • The unmix module is renamed eea (for endmember extraction algorithms)
  • The unmix module method 'unmix' is renamed 'extract'
  • Fixes:
  • classifiers.SAM, SID and NormXCorr: the plot_single_map method accepted 0 for the lib_idx parameter; the expected starting value is 1. Fixed.
  • classifiers.NormXCorr: the colorbar was inverted and the threshold value was inverted from the expected behavior. Fixed.

New in PySptools 0.09 Alpha (Apr 9, 2014)

  • The main improvement in this version is a threshold parameter added to the classifier classes. The threshold can be a single value or multiple values. In the latter case, there is one individual threshold for each signal to match. Another nice improvement to the classifier classes is the capacity to plot one map for each signal to match, with the threshold of your choice (see the sketch after this list).
  • A new module, 'spectro', receives all the classes and functions related to the USGS library and the convex hull removal. This move leaves only one function in the sigproc module. A new class adds get and search functionality to the USGS library.
  • This version is under the Apache License, Version 2.0.
  • ... older release notes are moved to the documentation.
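
  A minimal sketch of the multi-threshold classification described above, using SAM. The
  classify() and plot_single_map() signatures are assumptions; the threshold list (one
  value per signal) and the per-signal maps come from this entry.

      import numpy as np
      from pysptools.classification import SAM

      M = np.random.rand(50, 40, 200)          # hypothetical (m x n x p) cube
      E = np.random.rand(3, 200)               # three hypothetical signals to match

      sam = SAM()
      # One threshold per signal to match; a single float would apply to all of them.
      cmap = sam.classify(M, E, threshold=[0.1, 0.15, 0.1])
      sam.plot_single_map('./results', 2)      # assumed signature: map of the second signal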

New in PySptools 0.08 Alpha (Feb 4, 2014)

  • The two major improvements are a substantial speedup for NFINDR and an interface to the convex_hull_removal function.
  • In detail:
  • NFINDR now exists in two versions. The pure Python version has a 4x speed gain over version 0.07. The code is the same; the only difference is an almost direct call to LAPACK, something we can do with numpy and scipy. The second version is a Cython rewrite without a call to LAPACK. It gives an 8x speedup compared to the 0.07 version. This result seems abnormal, as beating MKL BLAS/LAPACK is difficult but not impossible; it depends on the context, check the unmixing module documentation.
  • Speedup:
  • GLRT: 2x against the 0.07 version
  • New:
  • class interface to the convex_hull_removal function (see the sigproc module)
  • a continuum-based features extractor in the tetracorder style (see the sigproc module)
  • Distutils setup
  • Renamed:
  • files tests/ex_*.py renamed to test_*.py
  • Updated:
  • test_hull.py

New in PySptools 0.07 Alpha (Nov 22, 2013)

  • The most important improvements are the documentation, a NormXCorr classifier and a fix to NFINDR.
  • New:
  • documentation
  • OSP target detection
  • NormXCorr distance measure
  • chebyshev distance measure
  • NormXCorr classifier
  • corr stat function
  • cov stat function
  • Fixed:
  • NFINDR
  • NFINDR has a new feature: you can use ATGP to initialize the first endmember set.
  • convert2d_signal_last is renamed to convert2D
  • the signal module is renamed to sigproc

New in PySptools 0.06 Alpha (Sep 6, 2013)

  • Many improvements to UCLS, NNLS, FCLS, ACE, MatchedFilter, HfcVd and NFINDR (all now in C-order). Overall, this version makes better use of numpy.
  • Speedup improvements to ACE, SAM and SID
  • New functions
  • hyperCem -> detection.CEM
  • hyperGLRT -> detection.GLRT
  • eia.FIPPI -> unmixing.FIPPI
  • Added a get_idx method to the classes PPI, NFINDR, ATGP and FIPPI. This method returns an index list corresponding to the induced endmembers.
  • Added a legend to the plot function for the classes SAM and SID
  • Unlike the previous version, Cython is not used for this one. Exploring the power of numpy gave a very good speedup to SID and SAM.
  • Fix: NFINDR: np.zeros((p), dtype=np.int16) becomes IDX = np.zeros((p), dtype=np.long)
  • In conclusion, almost all the code was revisited and improved

New in PySptools 0.05 Alpha (Sep 6, 2013)

  • The library was reorganized into many modules: abundance_map, classification, detection, formatting, material_count, signal, tests, unmixing and util. The functions have been sorted into them and in some cases a class interface to the function was added. See the tests examples; they are the only documentation for now.
  • Cython has been introduced. For this version, only the SAM classifier is compiled with Cython.
  • Three distributions come with this version:
  • the source
  • one with SAM classifier compiled for Windows 7 32 bits and Python 2.7 and
  • one with SAM classifier compiled for Windows 7 64 bits and Python 2.7
  • You can use the source package; it falls back on the Python version of the SAM classifier. To use it, unzip the package and place the checkout directory on your PYTHONPATH.
  • ex_clsf.py is renamed to ex_clf.py
  • For the class interfaces, the format of the HSI data cube is (m x n x p), where m and n are the spatial indices and p is the pixel (spectrum). For a library of signals the format is (N x p), where N is the index and p the pixel (see the sketch below).
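
  A minimal illustration of the array layouts just described; the shapes are hypothetical.

      import numpy as np

      # HSI data cube for the class interfaces: (m x n x p), m and n spatial, p the pixel.
      M = np.random.rand(50, 40, 200)

      # Library of signals: (N x p), N spectra of p bands each.
      E = np.random.rand(4, 200)

      assert M.shape[2] == E.shape[1]          # both share the same number of bands p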

New in PySptools 0.04 Alpha (Jul 9, 2013)

  • Note: the SPy library uses C-order for the numpy representation of the data; the signal is in the innermost loop. That is the best representation for almost all algorithms that we can apply to this kind of data. In a first attempt I tried to simulate the Matlab way of doing hyperspectral processing (using an F-order shape). But for this version I am revising my position and have started doing things in a more Pythonic way. For the listed functions the signal is always last.
  • This gives a speed improvement of around 20% for:
  • SAM, SID, SAM_classifier, SID_classifier
  • For the next versions, more functions will be migrated to the C-order, signal last.
  • Added from the Matlab Hyperspectral Toolbox: hyperMatchedFilter -> MatchedFilter (with a good optimization from numpy) and hyperAce -> ACE
  • A test program for the detection algorithms -> ex_detect.py
  • New functions, not from the current sources but from articles: SID and SAM, and the helper functions SID_classifier and SAM_classifier, all in clsf.py
  • A test program for the classification algorithms -> ex_clsf.py

New in PySptools 0.03 Alpha (Jun 10, 2013)

  • Added from the Matlab Hyperspectral Toolbox: hyperNnls -> NNLS
  • From the Endmember Induction Algorithms toolbox (EIA): EIA_ATGP -> ATGP
  • Patch to convert3d(): it can now accept a 1D vector as input

New in PySptools 0.02 Alpha (Apr 22, 2013)

  • Added from the Matlab Hyperspectral Toolbox:
  • hyperConvexHullRemoval -> convex_hull_removal
  • hyperPpi -> PPI
  • From the Endmember Induction Algorithms toolbox (EIA):
  • EIA_NFINDR -> NFINDR
  • Added two new examples: one for convex_hull_removal, ex_hull.py, and one for NFINDR, ex_eia.py.
  • Lightweight data for the tests.
  • Many improvements to the test programs.