---
title: Specific usage
---
import { Aside, CardGrid, Card, LinkCard, Steps, Badge } from '@astrojs/starlight/components';
import { Image } from 'astro:assets';

This document provides a more in-depth tutorial and demo for using the Continue VS Code extension with Db2 for i.
## Getting Started: Continue

Continue is the leading open-source AI code assistant for VS Code. It provides a wide range of AI features:

* Chat interface
* Inline code editing
* Autocomplete
### Install the Continue extension for VS Code

<Steps>
  1. Install the Continue extension from the [VS Code Marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue).
  
  2. Once installed, a new icon appears in your VS Code menu (mine is on the top right). Click on it to open the chat window.
  
</Steps>

Once you have the extension installed, you can configure the AI provider you want to use. Continue supports multiple AI providers (including [Watsonx](https://docs.continue.dev/customize/model-providers/more/watsonx)!). Choose the provider you want by clicking on the settings icon in the chat window.

For demonstration purposes, we will use the Ollama provider to host LLMs locally on your machine.


### Setting up Ollama Provider

Here is a step-by-step guide to setting up the Ollama provider with the IBM Granite models in Continue:

#### 1. Install Ollama

Install Ollama on your machine by following the link below:
<LinkCard title="Install Ollama" href="https://ollama.com/download" />

#### 2. Fetch the IBM Granite 3.0 models

The IBM Granite 3.0 models are available in the Ollama model registry. More information about the IBM Granite models can be found [here](https://ollama.com/blog/ibm-granite).

Using the Ollama CLI, fetch the IBM Granite 3.0 8b model by running the following command:
```bash
ollama pull granite3-dense:8b
```
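
Once the pull completes, you can confirm the model is available and give it a quick test from the terminal:

```bash
# List the models Ollama has downloaded; granite3-dense:8b should appear
ollama list

# Optionally start an interactive session with the model to verify it responds
ollama run granite3-dense:8b
```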

#### 3. Configure the Ollama provider in Continue

Open the VS Code Command Palette (press `Ctrl+Shift+P`) and search for `Continue: open config.json`. This opens the Continue central config file `$HOME/.continue/config.json` in your editor. To enable the Granite models in Ollama, add the following configuration to the `models` section:

```json title="~/.continue/config.json"
"models": [
    {
      "title": "Granite Code 8b",
      "provider": "ollama",
      "model": "granite3-dense:8b"
    }
  ],
```
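
If you also want a local model to power inline completions, Continue's config.json supports a separate `tabAutocompleteModel` entry. A minimal sketch, assuming you simply reuse the same Granite model for autocomplete:

```json title="~/.continue/config.json"
"tabAutocompleteModel": {
    "title": "Granite Code 8b (autocomplete)",
    "provider": "ollama",
    "model": "granite3-dense:8b"
  },
```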

Save this file, then select the Granite model in the chat window.




### Other LLM Providers

You can also use other LLM providers with Continue. Here are some of the available providers:
- Watsonx
- Anthropic
- OpenAI
- Gemini (Google)

<Aside type="note">
  Although most models are good with SQL, some models are better than others. Here are some other models that work well with the Db2 for i assistant:
  - Llama 3.3
  - Claude 3.5
  - Claude 3.7
  - Mistral Large

You can find more information for setting up additional models in Continue's [documentation](https://docs.continue.dev/customize/model-providers).
</Aside>
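
Hosted models are configured the same way as the local Ollama model above, by adding another entry to the `models` array. As an illustration only, a hypothetical entry for Claude 3.5 via the Anthropic provider might look like this (the `apiKey` value is a placeholder; check Continue's provider docs for exact model identifiers):

```json title="~/.continue/config.json"
"models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-latest",
      "apiKey": "YOUR_ANTHROPIC_API_KEY"
    }
  ],
```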


## Examples

Once you have the extension installed and the AI provider configured, you can ask questions about your database in the chat window using the `@db2i` context provider. In Continue, a context provider is very similar to a chat participant in GitHub Copilot: it supplies additional context to the AI model so it can generate more accurate SQL queries.

More on context providers can be found [here](https://docs.continue.dev/customize/context-providers/).

### Working with Tables

#### Example 1: Summarize the columns in the `EMPLOYEE` table



**Notes:**
- The AI model recognizes the table reference `EMPLOYEE` and provides a summary of the columns in the table (you can verify this with the catalog query below).
- Primary key and constraint information is also provided.

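To sanity-check a summary like this yourself, you can query the Db2 for i catalog directly. A minimal sketch, assuming the `EMPLOYEE` table lives in the `SAMPLE` schema:

```sql
-- Columns and data types for SAMPLE.EMPLOYEE, straight from the catalog
SELECT COLUMN_NAME, DATA_TYPE, LENGTH, IS_NULLABLE
  FROM QSYS2.SYSCOLUMNS
 WHERE TABLE_SCHEMA = 'SAMPLE'
   AND TABLE_NAME = 'EMPLOYEE'
 ORDER BY ORDINAL_POSITION;
```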

#### Example 2: Join the `EMPLOYEE` and `DEPARTMENT` tables



**Notes:**
- The AI model recognizes the table references `EMPLOYEE` and `DEPARTMENT` and provides a SQL query that joins the two tables (a sketch of such a query follows this list).
- The SQL query is generated based on the context provided by the `@db2i` context provider.
- The generated SQL query can be copied and run in the SQL editor in VS Code.
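
For reference, a minimal sketch of the kind of join the model produces, assuming the standard Db2 sample-database column names (`WORKDEPT` on `EMPLOYEE`, `DEPTNO` and `DEPTNAME` on `DEPARTMENT`):

```sql
-- Each employee alongside the name of the department they work in
SELECT E.EMPNO, E.FIRSTNME, E.LASTNAME, D.DEPTNAME
  FROM EMPLOYEE E
  JOIN DEPARTMENT D
    ON E.WORKDEPT = D.DEPTNO;
```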


#### Example 3: More complex queries


**Notes:**
- The AI model recognizes the table references `EMPLOYEE` and `DEPARTMENT` and provides a SQL query that calculates the total, average, and median salary for each department.
- Run the generated SQL query in the SQL editor in VS Code.

We can refine this query further by asking it to remove departments that don't have any employees (a sketch of the refined query follows):

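A minimal sketch of what the refined query might look like, again assuming the sample-database schema; the inner join naturally drops departments with no employees, and `MEDIAN` is available as an aggregate on recent Db2 for i releases:

```sql
-- Salary statistics per department; departments without employees
-- are excluded by the inner join
SELECT D.DEPTNAME,
       SUM(E.SALARY)    AS TOTAL_SALARY,
       AVG(E.SALARY)    AS AVG_SALARY,
       MEDIAN(E.SALARY) AS MEDIAN_SALARY
  FROM DEPARTMENT D
  JOIN EMPLOYEE E
    ON E.WORKDEPT = D.DEPTNO
 GROUP BY D.DEPTNAME;
```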


#### Tables context provider

When you connect to a system and start an SQL Job, we automatically create a default schema context provider with the tables in that schema. This allows you to ask questions about the tables in the schema without having to reference the schema in your queries.

My default library is `SAMPLE`, so `@Db2i-SAMPLE` is the context provider for the tables in the SAMPLE library:


Once you select `@Db2i-SAMPLE`, the tables appear in a dropdown list. You can select a table to get a summary of the columns in that table:


Example Prompt:


Note that you do not need to invoke `@Db2i` in order to use the table context items. One advantage of "tagging" the table directly is that we only look up information for that table, rather than searching the entire library list for it.

You can add additional table context providers by editing your library list in the SQL Job Manager:
1. Open the SQL Job Manager
2. Select New SQL Job, or edit the current Job
3. Add the library as the first entry in the library list
4. Save the Job

In the following image, I added `TOYSTORE3` to the library list, and now I can use the `@Db2i-TOYSTORE3` context provider to get information about the tables in the TOYSTORE3 library:


### Working with other references in your Library List


#### Example 1: Reference a function in QSYS2


**Notes:**
- The model recognizes the function reference `OBJECT_STATISTICS` in the `QSYS2` library and provides a summary of the function.
- Information about the `TOYSTORE3` library is also provided.
- An example of how to use `OBJECT_STATISTICS` is provided (a sketch is shown below).
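
For reference, `OBJECT_STATISTICS` is a table function in `QSYS2`. A minimal sketch of a typical call, listing the `*FILE` objects in the `TOYSTORE3` library (column names follow the IBM i services documentation; adjust the selection to your needs):

```sql
-- List *FILE objects in TOYSTORE3 with their last-used information
SELECT OBJNAME, OBJTYPE, OBJOWNER, LAST_USED_TIMESTAMP
  FROM TABLE (
         QSYS2.OBJECT_STATISTICS('TOYSTORE3', '*FILE')
       ) AS T;
```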