Programming/Deep Learning


Deep Learning

Introduction

Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. Learning can be supervised, semi-supervised or unsupervised.

Deep learning can be deployed in a huge range of applications, including:

  • Automatic speech recognition
  • Image recognition
  • Visual art processing
  • Natural language processing
  • Drug discovery and toxicology
  • Customer relationship management
  • Recommendation systems
  • Bioinformatics
  • Health diagnostics
  • Image restoration
  • Financial fraud detection


Six types of artificial neural network are currently in common use:

  • Recurrent Neural Network (RNN) – Long Short Term Memory
  • Convolutional Neural Network
  • Feedforward Neural Network – Artificial Neuron
  • Radial Basis Function Neural Network
  • Kohonen Self Organizing Neural Network
  • Modular Neural Network

The first two are the most widely used; a minimal sketch of each is shown below.
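
As a rough illustration only, here is how these two architectures are typically defined in TensorFlow/Keras (one of the environments listed below). The layer sizes and input shapes are arbitrary placeholders, not recommendations.

import tensorflow as tf

# Convolutional Neural Network (CNN): convolution and pooling layers,
# typically used for image data such as 28 x 28 pixel greyscale images.
cnn = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Recurrent Neural Network (RNN) with a Long Short Term Memory (LSTM) layer,
# typically used for sequence data such as text or time series.
# Here the (hypothetical) input is 100 time steps of 8 features each.
rnn = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(100, 8)),
    tf.keras.layers.Dense(10, activation='softmax')
])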


Development Environments

The following development environments are already available on our HPC:

  • Python 3.5 with TensorFlow (and Keras), and Theano (a quick sanity check is sketched below).
  • C/C++/Fortran with CUDA GPU programming.
  • PGI compiler with OpenACC programming for C and Fortran.
  • Matlab with deep learning libraries.
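
For the Python environment, the following minimal sketch checks that TensorFlow imports correctly and whether it can see a GPU on the current node (the exact call may differ slightly depending on the installed TensorFlow version):

#!/usr/bin/env python

# Sanity check: report the TensorFlow version and GPU visibility.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
print("GPU available:", tf.test.is_gpu_available())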


Deep Learning example

This is the "Hello World" of deep learning programs. The MNIST data set contains a large number of labelled images of handwritten digits (28 x 28 pixels). The program trains a neural network on the training portion of the data set and then evaluates it against the test portion.

#!/usr/bin/env python

import tensorflow as tf

# Load the MNIST data set of labelled 28 x 28 pixel handwritten digits.
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Scale the pixel values from 0-255 down to the range 0-1.
x_train, x_test = x_train / 255.0, x_test / 255.0

# Define a simple fully connected network: flatten each image, one hidden
# layer of 512 units, dropout for regularisation, and a 10-way softmax
# output (one unit per digit).
model = tf.keras.models.Sequential([
  tf.keras.layers.Flatten(),
  tf.keras.layers.Dense(512, activation=tf.nn.relu),
  tf.keras.layers.Dropout(0.2),
  tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train for 5 passes over the training data, then report accuracy on the test set.
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
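
Once trained, the same model can be used to classify new images. For example, an illustrative addition to the end of the script above would print the predicted and true labels for the first five test images:

import numpy as np

# Class probabilities for the first five test images.
predictions = model.predict(x_test[:5])

# The predicted digit is the class with the highest probability.
print(np.argmax(predictions, axis=1))
print(y_test[:5])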

