## Major Changes
### Fixed Parameter Parsing

Layer parameter handling has been significantly improved:

```
# Now correctly handles both styles:
Dense(64, "relu")                   # Positional params
Dense(units=64, activation="relu")  # Named params
```
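The parsing fix boils down to resolving positional and named parameters into a single canonical mapping. Here is a minimal Python sketch of that idea; the function and parameter names are illustrative assumptions, not the library's actual internals:

```python
# Illustrative sketch: map positional args onto a declared parameter
# order, then overlay keyword args, rejecting unknowns and duplicates.
def merge_params(param_order, args, kwargs):
    if len(args) > len(param_order):
        raise ValueError(f"too many positional params: {args}")
    merged = dict(zip(param_order, args))
    for name, value in kwargs.items():
        if name not in param_order:
            raise ValueError(f"unknown param: {name}")
        if name in merged:
            raise ValueError(f"duplicate param: {name}")
        merged[name] = value
    return merged

# Both call styles resolve to the same parameter dict:
positional = merge_params(["units", "activation"], [64, "relu"], {})
named = merge_params(["units", "activation"], [],
                     {"units": 64, "activation": "relu"})
```

Resolving both styles to one dict early means every downstream check (validation, defaults, HPO) only has to handle a single representation.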
### Validation Enhancements

- Strict positive-integer validation for critical parameters:

```
# These will now raise clear validation errors:
Conv2D(filters=-32)  # ERROR: filters must be positive
Dense(units=0)       # ERROR: units must be positive
```
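A check of this kind can be sketched in a few lines of Python. The helper name below is hypothetical; it only illustrates the rule the release notes describe (reject zero, negatives, and non-integers):

```python
# Illustrative positive-integer validator (not the library's real API).
def validate_positive_int(layer, name, value):
    # bool is a subclass of int, so exclude it explicitly.
    if not isinstance(value, int) or isinstance(value, bool) or value <= 0:
        raise ValueError(
            f"{layer} {name} must be a positive integer, got {value}")
    return value

validate_positive_int("Dense", "units", 64)  # passes
```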
### Improved Error Messages

- Added line/column information for better debugging:

```
ERROR at line 4, column 32: Conv2D filters must be positive integer, got -32
```
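One common way to carry line/column information is a dedicated exception type that formats the position into the message. This is a hedged sketch assuming the parser exposes token positions; the class name is an invention for illustration:

```python
# Hypothetical error type that prepends source position to the message.
class DSLValidationError(Exception):
    def __init__(self, message, line, column):
        super().__init__(f"ERROR at line {line}, column {column}: {message}")
        self.line = line
        self.column = column

err = DSLValidationError(
    "Conv2D filters must be positive integer, got -32", line=4, column=32)
```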
## Technical Improvements
### Layer Parameter Processing

- Unified parameter merging across layers:
  - Dense
  - LSTM
  - GRUCell
  - GaussianNoise
### Grammar Refinements

- Resolved token conflicts between `NUMBER`, `FLOAT`, and `INT`
- Simplified the `param_style1` rules
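The classic `FLOAT`/`INT` conflict is that a greedy `INT` rule tried first swallows the integer part of `3.14`. The usual fix is to try the more specific token first. A minimal Python sketch of that ordering (token names from the release notes; the regexes are assumptions about the grammar):

```python
import re

# Try FLOAT before INT so "3.14" is never lexed as INT(3) plus junk.
TOKEN_SPECS = [
    ("FLOAT", r"\d+\.\d+"),  # most specific first
    ("INT", r"\d+"),
]

def lex_number(text):
    for name, pattern in TOKEN_SPECS:
        if re.fullmatch(pattern, text):
            return (name, text)
    raise SyntaxError(f"cannot lex {text!r}")
```

Parser generators express the same idea through terminal priorities or longest-match rules; the ordering above is the hand-rolled equivalent.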
### HPO Support Updates

```
# Now correctly supports:
HPO(choice(32, 64, 128))     # Units choice
HPO(choice("relu", "tanh"))  # Activation choice
```
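Conceptually, `choice(...)` builds a search-space node that a tuner later samples from. A minimal Python sketch of that representation, purely illustrative and not the library's actual HPO implementation:

```python
import random

# A "choice" node records its options; sampling picks one at random.
def choice(*options):
    return {"hpo": "choice", "options": list(options)}

def sample(space, rng=random):
    if isinstance(space, dict) and space.get("hpo") == "choice":
        return rng.choice(space["options"])
    return space  # plain values pass through unchanged

units_space = choice(32, 64, 128)
activation_space = choice("relu", "tanh")
```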
## Bug Fixes
### Layer-Specific Fixes

- Fixed nested-list flattening in `GaussianNoise`
- Corrected the `STRING` token regex for activation functions
- Resolved `VisitError` wrapping issues
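The nested-list fix amounts to recursively flattening parameter values before they are used. A small self-contained sketch of that behavior (illustrative, not the actual `GaussianNoise` code path):

```python
# Recursively flatten arbitrarily nested lists of parameter values.
def flatten(values):
    flat = []
    for v in values:
        if isinstance(v, list):
            flat.extend(flatten(v))
        else:
            flat.append(v)
    return flat
```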
### Macro System

- Fixed parameter override logic during expansion:

```
# Now correctly handles:
define MyBlock {
    Dense(64)
    Dropout(0.5)
}
MyBlock(units=128)  # Properly overrides Dense units
```
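The override logic can be modeled as overlaying call-site keyword arguments onto each layer in the macro body that declares the matching parameter. A hedged Python sketch, with a hypothetical function name and a list-of-pairs body representation:

```python
# Illustrative macro expansion: call-site kwargs override matching
# params in the macro body; layers without that param are untouched.
def expand_macro(body, **overrides):
    expanded = []
    for layer, params in body:
        merged = dict(params)
        for name, value in overrides.items():
            if name in merged:
                merged[name] = value
        expanded.append((layer, merged))
    return expanded

my_block = [("Dense", {"units": 64}), ("Dropout", {"rate": 0.5})]
```

Here `expand_macro(my_block, units=128)` rewrites only the `Dense` layer's `units`, mirroring the `MyBlock(units=128)` example above.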
## Known Issues

- **PyTorch Support**: limited layer support (work in progress)
- **Macro Stability**: potential parser issues with nested layer blocks
- **HPO Limitations**: `log_range()` requires explicit integer casting
## Migration Guide

### Updating from v0.2.1

```
# Old style (may fail):
network MyNet {
    layers: Dense("64")  # String number
}

# New style (recommended):
network MyNet {
    layers: Dense(64)  # Integer number
}
```
## Next Steps

- Complete PyTorch layer support
- Stabilize the macro system
- Enhance HPO functionality

For the full changelog, see CHANGELOG.md. For documentation, visit docs/.