AI: Anomaly Detection in logfiles

From Embedded Lab Vienna for IoT & Security
Revision as of 14:02, 13 July 2022

 ➤ IMPORTANT: This page is still under construction.

Summary

This guide builds a basic AI model that performs binary classification to detect anomalies in logfiles. The model is also suitable for the Jetson AGX Xavier Development Kit.

Requirements

  • Packages: TensorFlow, Keras, Pandas, scikit-learn (sklearn), NumPy, seaborn, matplotlib
  • Software: PyCharm or any other Python editor
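The packages listed above can be installed with pip; a possible invocation (note that sklearn is published on PyPI as scikit-learn, and Keras ships bundled with recent TensorFlow releases):

```shell
# Installs the required packages; exact versions are left to pip.
pip install tensorflow pandas scikit-learn numpy seaborn matplotlib
```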

Description

Step 1 - Create a model

First we create a Keras Sequential model, which will be trained later.

 import tensorflow as tf
 from tensorflow.keras.models import Sequential
 from tensorflow.keras.layers import Dense
 
 model = Sequential()

The next step is to create the input layer: 63 nodes, one for each feature in our dataset.

 model.add(Dense(63, input_dim=63))

Next comes a hidden layer of 128 nodes with the ReLU (Rectified Linear Unit) activation function.

 model.add(Dense(128, activation='relu'))

Finally, the output layer consists of a single node representing 'attack' or 'no attack'; a sigmoid activation keeps its output in the range [0, 1].

 model.add(Dense(1, activation='sigmoid'))

We could set the learning rate to a specific value, but we leave it at the Adam default of 0.001.

 learning_rate = 0.001

For the optimizer we use the Adam optimizer with the learning rate defined above.

 optimizer = tf.optimizers.Adam(learning_rate)
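The steps above can be assembled into one runnable sketch. The 63-feature input and layer sizes follow the text; the binary_crossentropy loss and the random stand-in data are assumptions added here for illustration, since the guide has not yet covered compiling or training.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Build the network described above
model = Sequential()
model.add(Input(shape=(63,)))              # 63 features per log entry
model.add(Dense(63))                       # input layer: one node per feature
model.add(Dense(128, activation='relu'))   # hidden layer
model.add(Dense(1, activation='sigmoid'))  # output: attack / no attack

learning_rate = 0.001
optimizer = tf.optimizers.Adam(learning_rate)
model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['accuracy'])

# Sanity check with random data standing in for real logfile features
X = np.random.rand(10, 63).astype('float32')
preds = model.predict(X, verbose=0)
print(preds.shape)  # one score per sample, each in [0, 1]
```

The sigmoid output can later be thresholded (e.g. at 0.5) to yield the final 'attack' / 'no attack' decision.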

WIP

Step 2

WIP


Used Hardware