✨ Co - A frontend development AI that creates and writes the content of referenced files

Introduction

Co is a frontend development AI writing assistant. It is similar to Copilot but works differently: it automatically creates and writes the content of files that are referenced in your code.

Features

  • ✨ Auto-write/rewrite a module when you import it (with special comments)
  • 🌴 Generate a project from a markdown file
  • 🛠️ TDD: auto-generate implementations as you write test cases.

Demo

The demos below use OpenAI gpt-4o-mini.

Click an image to see the demo.

Vue - Write a simple page

Vue demo

Vue - Write autocomplete input

AutoComplete demo

Markdown: Generate a project

Markdown demo

TDD

TDD demo

Quick start

Install the package.

npm install @imaginary-ai/core

Create co.config.{js|ts} in the current working directory.

// co.config.ts
import { OpenAITextGenerator, defineCoConfig } from '@imaginary-ai/core'

export default defineCoConfig({
  baseDir: '.',
  includes: ['**/*.js', '**/*.ts', '**/*.md'],
  excludes: ['**/node_modules/**', '**/.vscode', '**/.git/**', '**/ai.css'],
  generator: new OpenAITextGenerator({
    apiKey: '...', // Put your OPENAI_API_KEY here
    model: 'gpt-4o-mini',
    temperature: 0,
  }),
})

Write some code that imports a function:

// @co:
import { sayHello } from "./path-to-file.js"
// @co-end

sayHello()

Go to the terminal and run npx co run; path-to-file.js will then be generated.

// path-to-file.js
export function sayHello() {
    console.log("Hello, World!");
}

Usage

Check the documentation for more usage examples.

Why Co ?

Small Issues with Copilot

I believe many people have used GitHub Copilot or similar code assistants. Personally, I use GitHub Copilot.

The feature I use most frequently in GitHub Copilot is autocomplete. While editing a file, the model predicts the next content and provides suggestions.

This truly deserves the name Copilot—it always lends a helping hand just when you’re feeling exhausted from coding. I often find myself genuinely grateful for it in those moments. At this point, I’ve gotten so used to autocomplete suggestions that it feels strange when they don’t appear.

Within a single file, it often does a good job predicting what I need. However, things get a bit awkward when working across multiple files.

For example, while writing File A, I realize I need a File B containing some functions to complete my work in File A.

With GitHub Copilot:

  1. It's best to have two tabs open simultaneously—one for File A and one for File B—to provide the model with the correct context (which might mean closing all other tabs first).
  2. Switch to File B and write the functions I need. (If lucky, Copilot might guess what I want and autocomplete them for me.)
  3. I return to File A and import the functions from File B.

If your Copilot behaves like mine, it’s probably most helpful in step 3, once it can see which functions you are trying to use. While I am editing File B, it is harder for Copilot to infer what I want to write, because the relevant context lives in File A.

Additionally, this disrupts my work in File A. Can it be made even smoother?

How Co Is Different

I imagined a different kind of Copilot: one that doesn't just autocomplete within a single file but also understands your intent and writes the files you need in your workspace—almost like having two engineers working on the same project simultaneously.

That’s what Co does. With Co, the workflow becomes:

  1. In File A, reference the function I want from File B.
  2. Save the file.
  3. If necessary, modify File B.

Co will automatically generate the module you need and create an appropriate interface based on how you use the function in File A.
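
For example, File A might use a helper from a File B that does not exist yet. A minimal sketch (formatDate and ./date-utils.js are made-up names for illustration):

// fileA.js
// @co:
import { formatDate } from "./date-utils.js"
// @co-end

console.log(formatDate(new Date()))

After running npx co run, Co creates date-utils.js and gives formatDate an interface that matches this call site; the exact implementation depends on the model.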

The advantage of this approach is that you don’t need to describe what you want in natural language. Describing a feature in natural language can be more cumbersome than just writing a piece of code to demonstrate it—LLMs seem to work the same way. Using natural language forces us to translate abstract code concepts into descriptive sentences, which every software engineer knows can be frustrating.

This method may align more closely with how LLMs excel at autocomplete.

Of course, this isn’t meant to replace Copilot. Copilot's autocomplete is great, and its ability to generate documentation or refactor code is also useful. Co simply complements Copilot by addressing one of its limitations (in my opinion).

How Does Co Work?

From an AI perspective, it uses a simple prompt and makes a single request—no agent-based techniques are involved.

We have "source files" reference a file not been written, I need you write the "referenced file" contents which fulfill the usage requirements in other source files. You must only return file content without any word.

${sourceFileContents}

---referenced file---
filename: ${this.path}
content:

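The referencing source files and the target filename are interpolated into that template, and the model's reply is written out as the file body. Roughly, a single request like this is enough (an illustrative sketch using the openai Node SDK, not Co's actual source; the function name and wiring are assumptions):

// generate-referenced-file.js (illustrative only)
import OpenAI from 'openai'

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

async function generateReferencedFile(referencedPath, sourceFileContents) {
  // Fill the source files and target filename into the prompt template above.
  const prompt =
    'We have "source files" reference a file not been written, I need you write the "referenced file" contents which fulfill the usage requirements in other source files. You must only return file content without any word.\n\n' +
    `${sourceFileContents}\n\n` +
    '---referenced file---\n' +
    `filename: ${referencedPath}\n` +
    'content:\n'

  // One chat completion call; the reply becomes the generated file.
  const response = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    temperature: 0,
    messages: [{ role: 'user', content: prompt }],
  })
  return response.choices[0].message.content
}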

Use Cases and Potential of Co

Prototyping

Co is great for quickly generating module interfaces while keeping you in your workflow. Of course, the generated code won’t be perfect—it will still require some manual refinement. An ideal workflow might look like this:

> Edit Module A  
> Import module B in A and use it as if it’s already implemented.  
> Complete Module A.  
> Edit Module B.  
> Import module C in B and use it as if it’s already implemented.  
> Complete Module B.  
> Edit Module C.  
> ....  
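
Concretely, the "Edit Module B" step might look like this, continuing the chain with a hypothetical module C (fetchUser and ./api/user.js are invented for illustration):

// moduleB.js (generated by Co in the previous round, now refined by hand)
// @co:
import { fetchUser } from "./api/user.js"
// @co-end

export async function getUserName(id) {
  const user = await fetchUser(id)
  return user.name
}

Saving and running npx co run again generates ./api/user.js, and the chain continues module by module.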

TDD (Test-Driven Development)

This workflow is actually quite similar to TDD: define how something should be used before implementing it.

With Co, you can generate interface modules while writing test cases. When combined with Copilot, this approach can be even more powerful.
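
For instance, a test can reference a module that has not been written yet. A hedged sketch (slugify, ./slugify.js, and Vitest as the test runner are assumptions, not something from Co's docs):

// slugify.test.js
import { describe, it, expect } from "vitest"
// @co:
import { slugify } from "./slugify.js"
// @co-end

describe("slugify", () => {
  it("lowercases and replaces spaces with dashes", () => {
    expect(slugify("Hello World")).toBe("hello-world")
  })
})

Running npx co run then generates ./slugify.js with an implementation shaped by these assertions, and Copilot's autocomplete can take it from there.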
