add okai pages
mythz committed Feb 8, 2025
1 parent 7b0eaa7 commit e66703b
Showing 4 changed files with 1,169 additions and 0 deletions.
114 changes: 114 additions & 0 deletions MyApp/_pages/autoquery/okai-chat.md
---
title: Free LLM Chat Prompts
---

## okai chat

As part of the development of [okai](/autoquery/okai-models) for generating [Blazor CRUD Apps from a text prompt](/autoquery/text-to-blazor) using your preferred AI Models, we've also made available a generic **chat** prompt that can be used as a convenient way to conduct personal research against many of the world's most popular Large Language Models - for Free!

You can start using the `npx okai chat` script immediately to ask LLMs for assistance:

:::sh
npx okai chat "command to copy a folder with rsync?"
:::

This will use the default model (currently `codestral:22b`) to answer your question.

### Select Preferred Model

You can also use your preferred model by passing either the model **name** or its **alias** to the `-m <model>` flag, e.g. you can use
[Microsoft's PHI-4 14B](https://techcommunity.microsoft.com/blog/aiplatformblog/introducing-phi-4-microsoft%E2%80%99s-newest-small-language-model-specializing-in-comple/4357090)
model with:

:::sh
npx okai -m phi chat "command to copy folder with rsync?"
:::

### List Available Models

We're actively adding more high-performing and leading experimental models as they're released.
You can view the list of available models with `ls models`:

:::sh
npx okai ls models
:::

Which at this time will return the following list of available models along with instructions for how to use them:

```txt
USAGE (5 models max):
  a) OKAI_MODELS=codestral,llama3.3,flash
  b) okai -models codestral,llama3.3,flash <prompt>
  c) okai -m flash chat <prompt>

FREE MODELS:
  claude-3-haiku (alias hakiu)
  codestral:22b (alias codestral)
  deepseek-r1:70b
  deepseek-v3:671b (alias deepseek)
  gemini-flash-1.5
  gemini-flash-1.5-8b (alias flash-8b)
  gemini-flash-2.0 (alias flash)
  gemini-flash-lite-2.0 (alias flash-lite)
  gemini-flash-thinking-2.0 (alias flash-thinking)
  gemini-pro-2.0 (alias gemini-pro)
  gemma2:9b (alias gemma)
  gpt-3.5-turbo (alias gpt-3.5)
  gpt-4o-mini
  llama3.1:70b (alias llama3.1)
  llama3.3:70b (alias llama3.3)
  llama3:8b (alias llama3)
  mistral-nemo:12b (alias mistral-nemo)
  mistral-small:24b (alias mistral-small)
  mistral:7b (alias mistral)
  mixtral:8x22b
  mixtral:8x7b (alias mixtral)
  nova-lite
  nova-micro
  phi-4:14b (alias phi,phi-4)
  qwen-plus
  qwen-turbo
  qwen2.5-coder:32b (alias qwen2.5-coder)
  qwen2.5:72b (alias qwen2.5)
  qwq:32b (alias qwq)
  qwq:72b

PREMIUM MODELS: *
  claude-3-5-haiku
  claude-3-5-sonnet
  claude-3-sonnet
  deepseek-r1:671b (alias deepseek-r1)
  gemini-pro-1.5
  gpt-4
  gpt-4-turbo
  gpt-4o
  mistral-large:123b
  nova-pro
  o1-mini
  o1-preview
  o3-mini
  qwen-max

* requires valid license:
  a) SERVICESTACK_LICENSE=<key>
  b) SERVICESTACK_CERTIFICATE=<LC-XXX>
  c) okai -models <premium,models> -license <license> <prompt>
```

Where you'll be able to use any of the great performing, inexpensive models listed under `FREE MODELS` for free,
whilst ServiceStack customers with an active commercial license can also use any of the more expensive
and better performing models listed under `PREMIUM MODELS` by either:

a) Setting the `SERVICESTACK_LICENSE` Environment Variable with your **License Key**
b) Setting the `SERVICESTACK_CERTIFICATE` Variable with your **License Certificate**
c) Inline using the `-license` flag with either the **License Key** or **Certificate**

### FREE for Personal Usage

To be able to maintain this as a free service we're offering it as a tool for developers' personal
assistance and research, limiting usage to **60 requests/hour**, which should be more than enough for most
personal usage and research whilst deterring usage in automated tools.

:::tip info
Rate limiting is implemented with a sliding [Token Bucket algorithm](https://en.wikipedia.org/wiki/Token_bucket) that replenishes 1 additional request every 60s
:::
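
For a rough intuition of how this behaves, below is a minimal, illustrative TypeScript sketch of a token bucket with a capacity of 60 that replenishes one token per minute (a conceptual sketch only, not the service's actual implementation):

```ts
// Illustrative sketch only (not the service's actual implementation):
// a token bucket with capacity 60 that replenishes 1 token every 60s.
class TokenBucket {
    private tokens: number
    private lastRefill: number

    constructor(private capacity = 60, private refillMs = 60_000) {
        this.tokens = capacity
        this.lastRefill = Date.now()
    }

    tryConsume(): boolean {
        const now = Date.now()
        // replenish 1 token for every full refill interval that has elapsed
        const refills = Math.floor((now - this.lastRefill) / this.refillMs)
        if (refills > 0) {
            this.tokens = Math.min(this.capacity, this.tokens + refills)
            this.lastRefill += refills * this.refillMs
        }
        if (this.tokens === 0) return false
        this.tokens -= 1
        return true
    }
}

// e.g. a request is only served while tryConsume() returns true
const bucket = new TokenBucket()
console.log(bucket.tryConsume()) // true until the bucket is empty
```
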
130 changes: 130 additions & 0 deletions MyApp/_pages/autoquery/okai-db.md
---
title: Generate CRUD APIs and UIs from existing DBs
---

A core piece of functionality in the [Text to Blazor CRUD App](/autoquery/text-to-blazor) feature is distilling an AI Prompt into TypeScript classes that can be [further customized](/autoquery/okai-models#customize-data-models)
to generate AutoQuery CRUD APIs and Admin UIs for managing the underlying RDBMS tables.

## TypeScript Data Models

Using TypeScript is an effortless way to define data models, offering a DSL-like, minimal-boilerplate format that's human-friendly to read and write. It leverages TypeScript's powerful Type System and is validated against the referenced [api.d.ts](https://okai.servicestack.com/api.d.ts) schema to provide a rich authoring experience
with strong typing and intellisense - containing all the C# Types, interfaces, and attributes used in defining APIs, DTOs and Data Models.
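
For example, a minimal Data Model in this format could look something like the following sketch (the class and property names are purely illustrative, and additional attributes from `api.d.ts` can be applied where needed):

```ts
// Illustrative only: a small data model in the DSL-like TypeScript format.
// Class and property names are hypothetical; attributes from the referenced
// api.d.ts schema (e.g. validation or UI hints) could be added where required.
export class Todo {
    id: number
    text: string
    isFinished: boolean
    createdDate: Date
}
```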

### Blueprint for Code Generation

The TypeScript Data Models serve as the blueprint for generating everything needed to support the feature
in your App, including the AutoQuery **CRUD APIs**, **Admin UIs** and **DB Migrations** that can re-create the necessary tables from scratch.

## 1. Generate RDBMS Metadata

The first step in generating TypeScript Data Models is to capture the metadata from the existing RDBMS tables,
which we can do with the `App.json` [AppTask](https://docs.servicestack.net/app-tasks) below. It uses your App's configured
RDBMS connection to write the Table Definitions for all tables in the specified connection and schema
to the file of your choice (e.g. `App_Data/App.json`):

```csharp
// Writes the Table Definitions of the App's default RDBMS connection and schema to App_Data/App.json
AppTasks.Register("App.json", args =>
    appHost.VirtualFiles.WriteFile("App_Data/App.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:null, schema:null))));
```

This task can then be run from the command line with:

:::sh
dotnet run --AppTasks=App.json
:::

Which generates `App_Data/App.json` containing the table definition metadata for all tables in
the specified RDBMS, e.g.:

```json
[
  {
    "name": "AspNetUserClaims",
    "columns": [
      {
        "columnName": "Id",
        "columnOrdinal": 0,
        "columnSize": -1,
        "numericPrecision": 0,
        "numericScale": 0,
        "isUnique": true,
        "isKey": true,
        "baseCatalogName": "techstacks",
        "baseColumnName": "Id",
        "baseSchemaName": "public",
        "baseTableName": "AspNetUserClaims",
        "dataType": "System.Int32",
        "allowDBNull": false,
        "providerType": 9,
        "isAliased": false,
        "isExpression": false,
        "isAutoIncrement": true,
        "isRowVersion": false,
        "isHidden": false,
        "isLong": false,
        "isReadOnly": false,
        "dataTypeName": "integer",
        "columnDefinition": "INTEGER PRIMARY KEY AUTOINCREMENT"
      },
    ],
    ...
]
```

### Different Connection or DB Schema

If you prefer to generate the metadata for a different connection or schema, you can create a new AppTask
with your preferred `namedConnection` and/or `schema`, e.g.:

```csharp
// Writes the Table Definitions of the "reports" connection's "sales" schema to Sales.json
AppTasks.Register("Sales.json", args =>
    appHost.VirtualFiles.WriteFile("Sales.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:"reports", schema:"sales"))));
```

That you could then generate with:

:::sh
dotnet run --AppTasks=Sales.json
:::

## 2. Generate TypeScript Data Models

The next step is to generate TypeScript Data Models from the captured metadata, which can be done with the `okai` tool
by running the `convert` command with the path to the `App.json` table definitions. This writes the
TypeScript Data Models to stdout, which can be redirected to a file in your **ServiceModel** project, e.g.:

:::sh
npx okai convert App_Data/App.json > ../MyApp.ServiceModel/App.d.ts
:::
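
For instance, for the `AspNetUserClaims` table captured above, the generated declarations could look something like this (a hypothetical sketch only - the actual output depends on the full table metadata and the `okai` tool's conventions):

```ts
// Hypothetical sketch of what the converted output for the AspNetUserClaims
// table metadata might contain; the actual generated App.d.ts will reflect
// every column captured in App_Data/App.json.
export class AspNetUserClaims {
    // "isKey": true and "isAutoIncrement": true map to an auto-incrementing primary key
    id: number
    // ...remaining columns from the captured table metadata
}
```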

## 3. Generate CRUD APIs and Admin UIs

The data models defined in the `App.d.ts` TypeScript Declaration file are what drive the generation of the Data Models, APIs, DB Migrations and Admin UIs. They can be further customized by editing the TypeScript Declaration file and re-running the `okai` tool with just the filename, e.g.:

:::sh
npx okai App.d.ts
:::

Which will re-generate the Data Models, APIs, DB Migrations and Admin UIs based on the updated Data Models.

![](/img/posts/okai-models/npx-okai-App.png)

:::tip
You only need to specify the `App.d.ts` TypeScript filename (i.e. not the filepath) from
anywhere within your .NET solution
:::

## Live Code Generation

If you'd prefer to see the generated code in real-time, you can add the `--watch` flag to watch the
TypeScript Declaration file for changes and automatically re-generate the files on Save:

:::sh
npx okai App.d.ts --watch
:::

<video autoplay="autoplay" loop="loop" controls>
<source src="https://media.servicestack.com/videos/okai-watch.mp4" type="video/mp4">
</video>