Programming/Deep Learning
From HPC
Revision as of 12:03, 21 November 2018
Deep Learning
Introduction
Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. Learning can be supervised, semi-supervised or unsupervised.
Deep learning can be deployed across a wide range of applications, including:
- Automatic speech recognition
- Image recognition
- Visual art processing
- Natural language processing
- Drug discovery and toxicology
- Customer relationship management
- Recommendation systems
- Bioinformatics
- Health diagnostics
- Image restoration
- Financial fraud detection
Development Environments
The following development environments are already available on our HPC:
- Python 3.5 with TensorFlow (and Keras), and Theano.
- C/C++/Fortran with CUDA GPU programming.
- PGI compiler with OpenACC programming for C and Fortran.
- MATLAB with deep learning libraries.
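A quick way to confirm these environments are reachable from a login shell might look like the sketch below. The exact commands, versions, and any module-loading steps are assumptions and will differ depending on how your cluster is configured.

```shell
# Hypothetical sanity checks -- adjust for your site's setup
# (e.g. you may need "module load" commands first).
python3 --version                                   # expect Python 3.5.x
python3 -c "import tensorflow, keras, theano"       # Python DL stack importable?
nvcc --version                                      # CUDA compiler for C/C++/Fortran
pgcc --version                                      # PGI compiler with OpenACC support
```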
Deep Learning example
This is the "Hello World" of deep learning programs. The MNIST data set contains 70,000 labelled images of handwritten digits (28 x 28 pixels), split into 60,000 training images and 10,000 test images. The program below trains a neural network on the training images and then evaluates its accuracy on the test images.
#!/usr/bin/env python
import tensorflow as tf

# Load the MNIST data set: 60,000 training and 10,000 test images.
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Scale pixel values from 0-255 down to the range 0-1.
x_train, x_test = x_train / 255.0, x_test / 255.0

# A simple feed-forward network: flatten each 28 x 28 image, one hidden
# layer of 512 ReLU units with dropout, then a 10-way softmax output.
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation=tf.nn.relu),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train for five passes over the training data, then report test accuracy.
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
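The final Dense layer in the example uses a softmax activation, which turns the network's 10 raw scores into a probability distribution over the digit classes 0-9. A minimal pure-Python sketch of softmax (the example scores are made up for illustration):

```python
import math

def softmax(scores):
    # Subtract the max score before exponentiating, for numerical stability.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Ten hypothetical raw scores (logits), one per digit class 0-9.
logits = [0.5, 1.2, -0.3, 2.0, 0.0, 0.1, -1.0, 0.7, 0.2, 0.4]
probs = softmax(logits)

# The probabilities sum to 1, and the largest one corresponds to the
# largest raw score (class 3 here), which is the predicted digit.
print(probs.index(max(probs)))  # → 3
print(round(sum(probs), 6))     # → 1.0
```

This is why the training labels can stay as plain integers 0-9 with the `sparse_categorical_crossentropy` loss: the loss compares the integer label against this 10-way probability distribution.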