NeurophStudio Changelog

What's new in NeurophStudio 2.96

May 2, 2019
  • Neuroph 2.96 comes with various API improvements, several new features, more examples for standard introductory machine learning tasks, and an initial implementation of the Java Visual Recognition API (JSR 381).
  • API improvements are related to normalization methods, data set splitting/sampling, He weight initialization, and k-fold cross-validation.
  • There are also initial implementations of new neural network architectures, including ART and LSTM.
  • The Samples project provides the package org.neuroph.samples.standard10ml with examples of 10 standard machine learning problems, suitable for teaching neural network and machine learning basics with Java.
  • There are also several contributions in the Contrib project related to hyperparameter search, backpropagation benchmarking, and a cross-entropy loss function, which are not yet fully tested and integrated.
  • This release also includes an initial implementation of parts of the Visual Recognition API, which is being developed as an official Java technology standard for visual recognition tasks within the JCP (Java Community Process).
  • The GitHub repository has also been reorganized, so the framework and the NeurophStudio GUI are now in separate repositories:
  • https://github.com/neuroph/NeurophFramework
  • https://github.com/neuroph/NeurophStudio
  • With the development of a number of advanced deep learning frameworks with support for GPU and distributed computing, such as TensorFlow, PyTorch, Deeplearning4j, and many others, the main purpose of Neuroph is to be entry-level, educational software that introduces Java developers to the field of neural networks and deep learning.
  • At some universities it is used as a first step for learning basic concepts and principles before moving on to more advanced frameworks, thanks to its interactive examples, visualizations, readable source code, and user-friendly IDE-like environment.
  • We're working on the Neural Networks ExPLODE teaching methodology (EXploratory Programming and Learning Open Development Environment), which will provide a set of tools, tutorials, and a teaching methodology intended to be the fastest way for Java developers to learn neural networks and machine learning.
  • Existing Neuroph users who need more advanced features and additional support should take a look at the Deep Netts Platform: https://deepnetts.com
  • The Deep Netts Platform is built on the same philosophy as Neuroph, an intuitive, easy-to-use neural network/deep learning environment for software developers, but with more advanced features and implementation. Future releases of Neuroph will include the Deep Netts Community Edition in order to provide better support for convolutional networks for image recognition.
  • Finally, we want to announce the upcoming release, which will be based on Apache NetBeans 11, include support for the Deep Netts Community Edition, and add the generic standard Java machine learning and visual recognition API being developed within JSR 381.
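The data set splitting mentioned above can be sketched roughly as follows. This is a minimal, hedged example assuming the 2.96 API; the split(...) signature and the 70/30 percentages are assumptions based on these release notes, not a verified usage of the released method.

```java
import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.util.TransferFunctionType;

public class SplitAndTrain {
    public static void main(String[] args) {
        // Small XOR-style data set: 2 inputs, 1 output
        DataSet dataSet = new DataSet(2, 1);
        dataSet.addRow(new DataSetRow(new double[]{0, 0}, new double[]{0}));
        dataSet.addRow(new DataSetRow(new double[]{0, 1}, new double[]{1}));
        dataSet.addRow(new DataSetRow(new double[]{1, 0}, new double[]{1}));
        dataSet.addRow(new DataSetRow(new double[]{1, 1}, new double[]{0}));

        // Split into training and test portions (assumed 2.96 split method)
        DataSet[] parts = dataSet.split(0.7, 0.3);

        MultiLayerPerceptron mlp =
                new MultiLayerPerceptron(TransferFunctionType.TANH, 2, 3, 1);
        mlp.learn(parts[0]); // train on the training portion only
    }
}
```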

New in NeurophStudio 2.7 Build 201210100934 (Jul 10, 2013)

  • Main changes include:
  • Performance improvements of about 30%, thanks to the use of plain arrays instead of ArrayList collections for storing layers, neurons, and connections
  • Listing of supported input functions, transfer functions, learning rules, neuron and layer classes through the API, via the Neuroph class
  • New DataSet class, which replaces the old TrainingSet class (besides the naming change, which prevents confusion with test and validation sets, there are some implementation changes, and the same class is now used for storing both supervised and unsupervised data sets)
  • Greatly improved visual editor in Neuroph Studio with a full component palette: everything available at the framework level can now simply be dragged and dropped
  • Weights histogram in Neuroph Studio
  • New real time error graph based on real time graph drawing from Visual VM
  • Improved Stock Market Prediction Wizard based on Time Series prediction
  • Normalize and shuffle data sets using right-click options
  • Bug fix for black-and-white image recognition
  • Fixed OCR wizards for text and handwritten character recognition
  • The project source tree is now split into several subprojects
  • We did this because the project source tree had started to grow, and it included various unrelated application-specific packages (for example, someone who needs image recognition is probably not interested in stock prediction)
  • Neuroph project now consists of the following subprojects:
  • Core - Neuroph framework core/essential basic classes
  • ImageRec - image recognition support
  • OCR - OCR support
  • Samples - Various examples
  • Contrib - Misc contributions and stuff under development
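The new DataSet class described above can be used for both supervised and unsupervised data, roughly like this. A minimal sketch assuming the 2.7-era API; the package location of DataSet has varied between Neuroph versions.

```java
import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;

public class DataSetExample {
    public static void main(String[] args) {
        // Supervised data set: 2 inputs, 1 output (replaces the old TrainingSet)
        DataSet supervised = new DataSet(2, 1);
        supervised.addRow(new DataSetRow(new double[]{0, 1}, new double[]{1}));

        // The same class stores unsupervised data: inputs only
        DataSet unsupervised = new DataSet(2);
        unsupervised.addRow(new DataSetRow(new double[]{0.3, 0.7}));
    }
}
```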

New in NeurophStudio 2.69 (Sep 28, 2012)

  • Advanced image recognition wizard (in Neuroph Studio)
  • Data normalization
  • Weights randomization techniques
  • IO Adapters (for file, url, database, stream)
  • Micro benchmarking framework
  • Resilient propagation
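The resilient propagation item above can be applied by swapping the learning rule on a network. A hedged sketch: the ResilientPropagation class name and its package are assumptions based on this release note.

```java
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.ResilientPropagation;
import org.neuroph.util.TransferFunctionType;

public class RpropExample {
    public static void main(String[] args) {
        MultiLayerPerceptron mlp =
                new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 4, 8, 2);
        // Replace the default backpropagation with resilient propagation
        mlp.setLearningRule(new ResilientPropagation());
    }
}
```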

New in NeurophStudio 2.4 (Jun 8, 2010)

  • Perceptron and MultiLayerPerceptron now have linear transfer functions in the input layer (this appears to improve learning, and it is the right thing to do)
  • Changed the way ThresholdNeuron calculates its output: it used to compare the total input with the threshold; now it computes the subtraction totalInput - threshold.
  • Since it has a Step transfer function on the output, this makes no difference to the final result, but it is a better model for visualization.
  • Training monitor is now displayed as internal frame so it does not hide behind the main frame.
  • New icons for toolbar buttons
  • Created start.bat for easyneurons
  • Default initial setting of max error 0.01 for all supervised learning rules (many users forget to set this when training from code)
  • Added a load(InputStream inputStream) method to the NeuralNetwork class to enable the use of getResourceAsStream to load a neural network from a jar.
  • Added BiasNeuron class, which provides the bias feature for MLPs and other networks. A bias neuron always has a high output level and has no inputs.
  • Added a bias neuron to MultiLayerPerceptron
  • Option to create direct connections from the input to the output layer.
  • Users can choose which learning rule to use for an MLP from the GUI: basic Backpropagation, Backpropagation with momentum, or the new DynamicBackpropagation, which provides new learning features.
  • Total network error formula fixed (again): total_error = (1/(2n)) * sum(e^2). We now multiply by 1/(2n); before it was just 1/n. The original formula uses 1/(2n).
  • Pause learning feature: learning threads can be paused from the GUI and from code.
  • Created PerceptronLearning rule, an LMS-based learning rule for perceptrons (but not the same as BinaryDeltaRule)
  • Added hasReachedStopCondition() to the SupervisedLearning class, so derived classes can override it to create custom stopping conditions if needed.
  • Added a new 'min error change' stopping condition to SupervisedLearning, so learning can be stopped if the error change becomes too small for some number of iterations (i.e. when it gets stuck in a local minimum)
  • Added doOneLearningIteration() method to IterativeLearning, which allows step-by-step learning
  • Added DynamicBackpropagation, which can use a dynamic learning rate and momentum. It is possible to specify min and max values and a change rate for both. If the total network error is decreasing, both parameters are increased in order to speed up the error reduction; when the error is increasing, both values are decreased to minimize the error growth.
  • Improved thread synchronization for the error graph; training is faster and drawing is smoother.
  • Added initializeWeights() methods to the NeuralNetwork class to provide a way to initialize the network with the same weights every time.
  • Neuron and neuron-component creation using reflection in NeuronFactory: provides a powerful mechanism for creating/adding custom neurons and transfer functions.
  • Added Properties class and modified NeuronProperties class (util package), which now accepts a neuron specification in the form of a (key, value) collection where the values are Class instances.
  • Perceptron and Backpropagation samples in easyNeurons, which provide learning visualization.
  • Neuron properties are now displayed in an internal frame, so they do not hide behind the main frame when the user clicks somewhere else. The frame also shows the neuron's class.
  • Added a mechanism for defining constraints and size validation for input and output vectors in training sets: TrainingSet constructors can accept sizes for the input and/or output vectors, and each training element is checked before it is added to the training set.
  • Fixed a bug when importing a training set that contained an empty line (an exception was thrown)
  • Stock Market Prediction samples
  • OCR tools and API (handwriting and text recognition)
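The load(InputStream) addition above enables loading a saved network bundled inside an application jar via getResourceAsStream. A minimal sketch; the resource name "myNetwork.nnet" and the input values are hypothetical.

```java
import org.neuroph.core.NeuralNetwork;

public class LoadFromJar {
    public static void main(String[] args) {
        // Load a previously saved network from the classpath (e.g. inside the jar)
        NeuralNetwork nnet = NeuralNetwork.load(
                LoadFromJar.class.getResourceAsStream("/myNetwork.nnet"));

        // Run the loaded network on one input vector
        nnet.setInput(0.5, 0.2);
        nnet.calculate();
        System.out.println(java.util.Arrays.toString(nnet.getOutput()));
    }
}
```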

New in NeurophStudio 2.3.1 (Jan 23, 2010)

  • Several bugfixes for version 2.3
  • fixed issue with editing the GUI in NetBeans (fixed the NetBeans project file)
  • fixed LMS formula
  • fixed testing in black-and-white mode for image recognition
  • fixed GUI bug - exceptions when creating large networks
  • Changed the image recognition API so the color mode is automatically detected from the settings used for network training; removed unnecessary methods
  • Graph view - migrated the graph view to JUNG 2.0, created network-specific layouts, and removed unnecessary options
  • An ANT build file that can build the jars for the library and the GUI is now included in the release.

New in NeurophStudio 2.3 (Jan 23, 2010)

  • Image recognition support (GUI tool and library)
  • See easyNeurons, Main menu > Tools > Image Recognition (a brief HOWTO is available in the help)
  • See org.neuroph.contrib.imgrec for classes for image recognition support in your applications (javadoc also available)
  • API improvements - see API_CHANGES.txt
  • Some basic samples in org.neuroph.samples
  • Improved javadoc documentation
  • Some UML diagrams - see doc/uml

New in NeurophStudio 2.2 (Jul 4, 2009)

  • This release brings the following features:
  • momentum for backpropagation (makes learning more than twice as fast)
  • xml support for neural networks and training sets
  • import training set from files (txt, csv...)
  • network error graph for LMS based learning rules
  • Instar, Outstar, BAM neural networks and learning rules
  • basic help system
  • sample image recognition application

New in NeurophStudio 2.1.1 Beta (Apr 11, 2009)

  • fixed - drawing bug in the Kohonen network sample
  • fixed - backpropagation with tanh was throwing exception
  • fixed - basic neuron sample
  • unsupervised hebbian learning
  • oja learning rule
  • new GUI components JNeuron, JLayer and JNeuralNetwork