
Commit 0111d3e

Merge pull request #7 from codefuse-ai/cyz_dev
Add project codespaces config
2 parents 58d7bc0 + 4d43990 commit 0111d3e

3 files changed: +231 / -0 lines

.devcontainer/devcontainer.json

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
{
  "image": "ghcr.io/lonrun/codefuse-query-tutorial:0.5",
  "hostRequirements": {
    "cpus": 4
  },
  "customizations": {
    "codespaces": {
      "openFiles": ["tutorial/README.md"]
    }
  }
}

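This devcontainer config is what backs the Codespaces flow described in the README below: it pins the prebuilt tutorial image, requests a 4-CPU machine, and opens tutorial/README.md when the Codespace starts. As a quick local sanity check, here is a minimal Python sketch (an illustration only, assuming it is run from the repository root) that reads the config and prints those fields:

```python
# Minimal sketch: inspect the Codespaces devcontainer config.
# Assumes it is run from the repository root, where the config lives
# at .devcontainer/devcontainer.json (plain JSON, no JSONC comments).
import json
from pathlib import Path

config = json.loads(Path(".devcontainer/devcontainer.json").read_text())

print("image:", config["image"])
print("required CPUs:", config["hostRequirements"]["cpus"])
print("opened on start:", config["customizations"]["codespaces"]["openFiles"])
```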
tutorial/README.md

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
## CodeQuery Tutorial

### Try CodeQuery's analysis capabilities with GitHub Codespaces

#### Steps

- Switch to the target branch on the project homepage
- Create a Codespace: click Code -> Codespaces and create a Codespace on the current branch
- Once created, open the Codespace; after it finishes loading, go to the project's tutorial/notebook directory
- Open one of the example Jupyter notebook analysis tutorials to start exploring

#### Note
- After the Jupyter page opens, if this is the first time the container has been loaded, you still need to select the kernel used by the tutorial: in the "Select Kernel" dialog at the top right, choose "Jupyter Kernel..." -> "Godel Kernel".

tutorial/notebook/go_analysis.ipynb

Lines changed: 207 additions & 0 deletions
@@ -0,0 +1,207 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This is a tutorial on analyzing a Go project with CodeFuse-Query. You will use the command-line tool to turn a code repository into data, and then analyze that repository with the Godel language."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "Check that the CLI is ready"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "!which sparrow"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "STEP 0: Clone the repository to analyze. We use the [gorm](https://github.com/go-gorm/gorm.git) project as an example."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2023-11-22T08:30:58.387715Z",
     "start_time": "2023-11-22T08:30:44.572634Z"
    }
   },
   "outputs": [],
   "source": [
    "!git clone https://github.com/go-gorm/gorm.git"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "STEP 1: Turn the code into data. Use the `sparrow database create` command to create a db file, specifying the repository to analyze (the gorm subdirectory of the current directory), the language to analyze (go), and the path where the db file is stored (./db/gorm under the current directory). Running this command produces a db file that stores the structured data of the code repository; all subsequent analysis runs against this data."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2023-11-23T03:46:32.220317Z",
     "start_time": "2023-11-23T03:46:12.785705Z"
    }
   },
   "outputs": [],
   "source": [
    "!sparrow database create --source-root gorm --data-language-type go --output ./db/gorm"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "STEP 2: Analyze the db file with the Godel analysis language. In this tutorial you can click the run button to the left of the code to execute the analysis script directly. On the command line, you can run a query script with `sparrow query run`; use `sparrow query run -h` for detailed parameter information."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "<b>Example</b> Query per-file code complexity information for [gorm](https://github.com/go-gorm/gorm.git).\n",
    "\n",
    "The first line uses a kernel magic command to specify the db path to analyze, followed by the Godel script that queries file code complexity."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "%db ./db/gorm\n",
    "// script\n",
    "use coref::go::*\n",
    "\n",
    "fn default_db() -> GoDB {\n",
    "    return GoDB::load(\"coref_go_src.db\")\n",
    "}\n",
    "\n",
    "/**\n",
    " * @param name: file name\n",
    " * @param func: function name\n",
    " * @param cmplx: cyclomatic complexity of the function\n",
    " * @param sl, el: function location, i.e. start line and end line\n",
    " */\n",
    "fn out(name: string, func: string, cmplx: int, sl: int, el: int) -> bool {\n",
    "    for(f in GoFile(default_db()), function in Function(default_db())) {\n",
    "        if ((!f.isAutoGenereatedFile()) &&\n",
    "            f.key_eq(function.getBelongsFile()) &&\n",
    "            name = f.getName() &&\n",
    "            func = function.getName() &&\n",
    "            cmplx = function.getCyclomaticComplexity() &&\n",
    "            sl = function.getLocation().getStartLineNumber() &&\n",
    "            el = function.getLocation().getEndLineNumber()) {\n",
    "            return true\n",
    "        }\n",
    "    }\n",
    "}\n",
    "\n",
    "fn main() {\n",
    "    output(out())\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Save the result of the previous query run to a JSON file"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "%%save_to ./query.json"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "STEP 3: You can now run further code analysis on the generated results. For example, combining the pandas library with the query.json file you just produced, you can sort functions by cyclomatic complexity and find the most complex ones:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2023-11-23T03:54:56.998681Z",
     "start_time": "2023-11-23T03:54:56.976694Z"
    },
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "%%python\n",
    "import pandas as pd\n",
    "data = pd.read_json('./query.json')\n",
    "data.sort_values('cmplx', ascending=False, inplace=True)\n",
    "top_10 = data.head(10)\n",
    "print(top_10)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "Enjoy!"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Godel kernel",
   "language": "rust",
   "name": "godel-jupyter"
  },
  "language_info": {
   "file_extension": ".gdl",
   "help_links": [
    {
     "text": "Godel kernel Magics",
     "url": "https://sparrow.alipay.com"
    }
   ],
   "mimetype": "text/rust",
   "name": "rust",
   "version": "0.0.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}

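Since the query in STEP 2 outputs one row per function with the columns name, func, cmplx, sl, and el, the query.json saved by %%save_to can be sliced further than the top-10 sort in STEP 3. The sketch below is a hypothetical follow-up (not part of the committed notebook), assuming query.json loads into a flat pandas table as it does in STEP 3:

```python
# Hypothetical follow-up to STEP 3: aggregate the query results per file.
# Assumes ./query.json was produced by the %%save_to step above and has
# the columns name, func, cmplx, sl, el emitted by the Godel query.
import pandas as pd

data = pd.read_json("./query.json")

# Approximate each function's length from its start and end lines.
data["length"] = data["el"] - data["sl"] + 1

# Per-file aggregates: total and max complexity, function count, and
# longest function, ranked by the files carrying the most total complexity.
per_file = (
    data.groupby("name")
    .agg(total_cmplx=("cmplx", "sum"),
         max_cmplx=("cmplx", "max"),
         functions=("func", "count"),
         longest_func=("length", "max"))
    .sort_values("total_cmplx", ascending=False)
)
print(per_file.head(10))
```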