NeuralLang
⚡ Simplifying NNs With Neural: First Code Generation Example, a Simple MNIST Classifier!


As a developer passionate about machine learning, I don't want to write repetitive boilerplate code for neural networks. Whether it’s TensorFlow, PyTorch, or ONNX, the process of defining layers, compiling models, and setting up training can feel tedious.

Defining neural networks in raw TensorFlow/PyTorch can be verbose.

What if you could write models more intuitively and compile them seamlessly?

The Neural DSL lets you define models concisely and convert them into executable TensorFlow/PyTorch code.

This is a basic feedforward neural network generated from Neural, designed to classify 28x28 images into 10 categories. It's well suited to handwritten digit recognition (like MNIST), small-scale image tasks, teaching neural network basics, or serving as a quick baseline for multi-class problems.


Neural Code

network MyModel {
    input: (None, 28, 28)
    layers:
        Dense(128, activation="relu")
        Dropout(rate=0.2)
        Output(units=10, activation="softmax")
    loss: "categorical_crossentropy"
    optimizer: "Adam"
}
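As a quick sanity check on the architecture, the parameter count of this network can be worked out by hand from the layer sizes in the DSL above (a sketch in plain Python):

```python
# Flatten turns each 28x28 image into a 784-element vector (no parameters).
flattened = 28 * 28  # 784

# Dense(128): one weight per input-output pair, plus one bias per unit.
dense1 = flattened * 128 + 128  # 100480

# Dropout has no trainable parameters.

# Output Dense(10): 128 inputs -> 10 units, plus 10 biases.
dense2 = 128 * 10 + 10  # 1290

total = dense1 + dense2
print(total)  # 101770
```

This should match `model.count_params()` on the generated Keras model.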

TensorFlow Code

import tensorflow as tf

model = tf.keras.Sequential(name='MyModel', layers=[
    # Keras input_shape excludes the batch dimension, so the DSL's
    # (None, 28, 28) maps to input_shape=(28, 28) here.
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(units=128, activation='relu'),
    tf.keras.layers.Dropout(rate=0.2),
    tf.keras.layers.Dense(units=10, activation='softmax'),
])

model.compile(loss='categorical_crossentropy', optimizer='Adam')


Compile to TensorFlow

neural compile example.neural


In the example above, I define a neural network using Neural DSL and compile it into TensorFlow code. The resulting Python file is ready for training!
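Training it takes only a few more lines. Here's a minimal sketch (assuming TensorFlow is installed; I use random data with MNIST's shapes as a stand-in, but `tf.keras.datasets.mnist.load_data()` drops in the same way):

```python
import numpy as np
import tensorflow as tf

# The model as generated from the Neural DSL above.
model = tf.keras.Sequential(name='MyModel', layers=[
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(units=128, activation='relu'),
    tf.keras.layers.Dropout(rate=0.2),
    tf.keras.layers.Dense(units=10, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# Stand-in data with MNIST's shapes; replace with
# tf.keras.datasets.mnist.load_data() (plus scaling to [0, 1]) for real use.
x_train = np.random.rand(256, 28, 28).astype('float32')
y_train = tf.keras.utils.to_categorical(
    np.random.randint(0, 10, size=256), num_classes=10)

model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)

# Each prediction is a softmax distribution over the 10 classes.
preds = model.predict(x_train[:5], verbose=0)
print(preds.shape)  # (5, 10)
```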

Try Neural DSL yourself! Here's the repo: https://github.com/Lemniscate-SHA-256/Neural

🛠 What features would you like to see next? Drop your ideas in the comments!
