# functorcoder
**functorcoder** is an open-source AI coding assistant built on large language models (LLMs), with an algebraic and modular design in Scala.js. It aims to provide a clean and extensible architecture for AI coding assistants, which is helpful for understanding the basic mechanics if you want to build your own.
Visit [vscode-scalajs-hello](https://github.com/doofin/vscode-scalajs-hello) to see how to work with Scala.js for VSCode extension development; in short, sbt is used to build the project and run the extension. There you will learn:
- setting up the development environment
- building the project and running the extension
- packaging the extension
Before loading the extension, you need to add options to your VSCode user settings, providing your OpenAI-compatible API key and URL. Here is an example:
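A minimal sketch of what such settings might look like. The setting keys below are hypothetical, invented for illustration; check the extension's `package.json` configuration contributions for the real names:

```jsonc
// VSCode settings.json — hypothetical keys, for illustration only
{
  "functorcoder.apiKey": "sk-...",                    // your OpenAI-compatible API key
  "functorcoder.apiUrl": "https://api.openai.com/v1"  // base URL of the API
}
```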
The project is divided into two main parts: the core module and the VSCode extension module.
**To get started**, read the file `extensionMain.scala` in the VSCode extension module. It is the main entry point for the extension.
The first part is the core module, which we aim to keep concise. It contains the main logic of the AI coding assistant:
- Large Language Model (LLM) integration
- sending prompts to the LLM and getting the response
Project file structure for the core module:
```bash
/functorcoder
├── /src/main/scala/functorcoder
│   ├── /llm                 # Integration with LLM (e.g., OpenAI API)
```