Hi everyone! 👾 I’m excited to share "Neural," a project I’ve been working on to simplify neural network development. Neural is a domain-specific language (DSL) and debugger that lets you define, train, and debug models with ease—whether via code, CLI, or a no-code interface. 🎛
Why Neural?
Building neural networks can be complex—boilerplate code, shape mismatches, and debugging woes slow us down. Neural tackles this with:
- A YAML-like DSL: define models concisely, e.g. `Conv2D(filters=32, kernel_size=(3,3))`.
- NeuralDbg: real-time monitoring of gradients, execution traces, and resources, with a `--hacky` mode for security analysis.
- Cross-Framework Support: export to TensorFlow, PyTorch, or ONNX.
Example: MNIST Classifier
Here’s a quick `.neural` file:
```
network MNISTClassifier {
  input: (28, 28, 1)
  layers:
    Conv2D(filters=32, kernel_size=(3,3), activation="relu")
    MaxPooling2D(pool_size=(2,2))
    Flatten()
    Dense(units=128, activation="relu")
    Output(units=10, activation="softmax")
  loss: "sparse_categorical_crossentropy"
  optimizer: Adam
}
```
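For comparison, here is a hand-written PyTorch sketch of the same model. This is not the exact code the compiler emits, just an illustration of what the DSL saves you from typing; the layer sizes follow the DSL spec above (note that `CrossEntropyLoss` in PyTorch expects raw logits, so the final softmax is left implicit).

```python
import torch
import torch.nn as nn

class MNISTClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 32, kernel_size=3)   # Conv2D(filters=32, kernel_size=(3,3))
        self.pool = nn.MaxPool2d(2)                   # MaxPooling2D(pool_size=(2,2))
        self.fc1 = nn.Linear(32 * 13 * 13, 128)       # Dense(units=128) after Flatten()
        self.fc2 = nn.Linear(128, 10)                 # Output(units=10)

    def forward(self, x):
        x = torch.relu(self.conv(x))   # (N, 32, 26, 26)
        x = self.pool(x)               # (N, 32, 13, 13)
        x = x.flatten(1)               # (N, 5408)
        x = torch.relu(self.fc1(x))
        return self.fc2(x)             # raw logits; softmax applied inside the loss

model = MNISTClassifier()
loss_fn = nn.CrossEntropyLoss()       # sparse categorical cross-entropy on integer labels
optimizer = torch.optim.Adam(model.parameters())

logits = model(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```

One nice side effect of the DSL is that shape bookkeeping like the `32 * 13 * 13` above is handled for you.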
Compile and Run
```shell
pip install neural-dsl
neural compile mnist.neural --backend pytorch
neural run mnist_pytorch.py
```
Debug and Test
```shell
neural debug mnist.neural --hacky
```
Visit http://localhost:8050 to see gradients, resource usage, and potential security issues.
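To give a feel for the kind of signal NeuralDbg surfaces, here is a minimal sketch of gradient monitoring in plain PyTorch using autograd hooks. This is an illustration of the general technique, not NeuralDbg's actual internals.

```python
import torch
import torch.nn as nn

# Toy model standing in for a compiled network.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

grad_norms = {}

def make_hook(name):
    # Record the L2 norm of each parameter's gradient during backward().
    def hook(grad):
        grad_norms[name] = grad.norm().item()
    return hook

for name, param in model.named_parameters():
    param.register_hook(make_hook(name))

loss = model(torch.randn(4, 8)).sum()
loss.backward()

for name, norm in grad_norms.items():
    print(f"{name}: {norm:.4f}")
```

Vanishing or exploding gradients show up immediately as norms collapsing toward zero or blowing up, which is exactly the kind of anomaly a live dashboard makes easy to spot.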
Neural is a work-in-progress DSL and debugger; bugs exist, and feedback is welcome!
What’s Next?
I’m adding:
- Automatic hyperparameter optimization (HPO)
- Research paper generation
- TensorBoard integration
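As a rough idea of what automatic HPO involves, here is a minimal random-search sketch. The objective function is a stand-in for "train the model, return validation loss", and the search space values are made up for illustration; the planned feature's actual API may look nothing like this.

```python
import random

def objective(lr, units):
    # Placeholder for training a model and returning its validation loss.
    return (lr - 0.001) ** 2 + (units - 128) ** 2 * 1e-6

# Hypothetical search space over two hyperparameters.
search_space = {"lr": [1e-4, 1e-3, 1e-2], "units": [64, 128, 256]}

random.seed(0)
best = None
for _ in range(10):
    params = {k: random.choice(v) for k, v in search_space.items()}
    score = objective(**params)
    if best is None or score < best[0]:
        best = (score, params)

print("best params:", best[1])
```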
Try it out on GitHub and let me know what you think!
🦾 Share your feedback—I’d love to hear from the community!