We all love AI. In recent years the boom in Artificial Intelligence has changed the world and is taking it into a new era. For any use probl...
Hey, it would be great to see chat input/output in the terminal for better interaction with the LLM, as well as how to manage new context within the token window.
Just a suggestion. Great read, saving it for future reference. Happy coding!
Actually a great idea, I will look into how I can code that! Thanks!
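A rough sketch of what that could look like, assuming Node's built-in `readline/promises` for the terminal loop and a simple character-count heuristic for the token window. The `askModel` call is a placeholder for however the article talks to the local LLM; only the trimming logic is concrete here.

```typescript
import * as readline from "node:readline/promises";

interface Message {
  role: "user" | "assistant";
  content: string;
}

// Rough token estimate: ~4 characters per token is a common heuristic.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Drop the oldest messages until the history fits the token budget,
// always keeping at least the most recent message.
function trimContext(history: Message[], maxTokens: number): Message[] {
  const trimmed = [...history];
  let total = trimmed.reduce((sum, m) => sum + estimateTokens(m.content), 0);
  while (trimmed.length > 1 && total > maxTokens) {
    const removed = trimmed.shift()!;
    total -= estimateTokens(removed.content);
  }
  return trimmed;
}

// Sketch of the terminal chat loop (not invoked here).
async function chatLoop(): Promise<void> {
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  });
  let history: Message[] = [];
  for (;;) {
    const line = await rl.question("> ");
    if (line === "/quit") break;
    history.push({ role: "user", content: line });
    // Keep the conversation inside the model's context window.
    history = trimContext(history, 2048);
    // const reply = await askModel(history); // placeholder: call the local LLM here
    // history.push({ role: "assistant", content: reply });
    // console.log(reply);
  }
  rl.close();
}
```

The budget (2048) and the 4-chars-per-token estimate are assumptions; a real implementation would use the model's actual context length and tokenizer.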
Yes. Connect if you need any help setting that up. Happy to lend a hand!
Sent a connection req on LinkedIn
Connected 🤝
Is it for typescript only?
What can a JS developer do?
As Claudio said, you'll have to change the import statements.
Here's one stack overflow question similar to your situation - stackoverflow.com/questions/783506...
learn TS
You need to change the "dev" command, removing the "tsc -b && " prefix, and then change the way the lib is imported.
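Concretely, the `package.json` change would look something like this (script names are hypothetical; the only part confirmed above is dropping the `tsc -b && ` build step):

```json
{
  "scripts": {
    "dev": "node index.js"
  }
}
```

and any TypeScript-style `import { X } from "lib"` lines would become `const { X } = require("lib")` if the project isn't set up as an ES module.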
Very interesting, seems easy. How resource-intensive is it? What kind of Linux machine could host this?
The LLM I am using is codellama with 7 billion parameters, and my ancient GTX 1650 can handle it, with only 3.2 GB of VRAM allocated to it. As for the Linux distro, any one that can handle GPU tasks should do, I guess.
Thanks for the information!
Glad I could help!