# How to contribute
:+1::tada: First off, thanks for taking the time to contribute! :tada::+1:
It's people like you that make security open source such a force in preventing
successful cyber-attacks. Following these guidelines helps keep the project
maintainable, easy to contribute to, and more secure. Thank you for taking the
time to follow this guide.
## Where to start
There are many ways to contribute. You can fix a bug, improve the documentation,
submit bug reports and feature requests, or take a first shot at a feature you
need for yourself.
Pull requests are necessary for all contributions of code or documentation.
## New to open source?
If you're **new to open source** and not sure what a pull request is, welcome!!
We're glad to have you! All of us once had a contribution to make and didn't
know where to start.
Even if you don't write code for your job, don't worry, the skills you learn
during your first contribution to open source can be applied in so many ways,
you'll wonder what you ever did before you had this knowledge. It's worth
learning.
[Learn how to make a pull request](https://github.com/PaloAltoNetworks/.github/blob/master/Learn-GitHub.md#learn-how-to-make-a-pull-request)
## Fixing a typo, or a one or two line fix
Many fixes require little effort or review, such as:
> - Spelling / grammar, typos, white space and formatting changes
> - Comment clean up
> - Change logging messages or debugging output
These small changes can be made directly in GitHub if you like.
Click the pencil icon in GitHub above the file to edit the file directly in
GitHub. This will automatically create a fork and pull request with the change.
See:
[Make a small change with a Pull Request](https://www.freecodecamp.org/news/how-to-make-your-first-pull-request-on-github/)
## Bug fixes and features
For something that is bigger than a one or two line fix, go through the process
of making a fork and pull request yourself:
> 1. Create your own fork of the code
> 2. Clone the fork locally
> 3. Make the changes in your local clone
> 4. Push the changes from local to your fork
> 5. Create a pull request to pull the changes from your fork back into the
> upstream repository
Please use clear commit messages so we can understand what each commit does.
We'll review every PR and might offer feedback or request changes before
merging.
MIT License
Copyright (c) 2022 Palo Alto Networks
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
# Template
This template is built for [Docusaurus 2](https://docusaurus.io/), a modern static website generator.
### Usage
```bash
npx create-docusaurus@2.4.3 my-website --package-manager yarn
```
> When prompted to select a template choose `Git repository`.
Template Repository URL:
```bash
https://github.com/PaloAltoNetworks/docusaurus-template-openapi-docs.git
```
> When asked how the template repo should be cloned choose "copy" (unless you know better).
```bash
cd my-website
yarn
```
### Local Development
```bash
yarn start
```
This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
### Build
```bash
yarn build
```
This command generates static content into the `build` directory, which can be served by any static content hosting service.
### Community Supported
The software and templates in the repo are released under an as-is, best effort,
support policy. This software should be seen as community supported and Palo
Alto Networks will contribute our expertise as and when possible. We do not
provide technical support or help in using or troubleshooting the components of
the project through our normal support options such as Palo Alto Networks
support teams, or ASC (Authorized Support Centers) partners and backline support
options. The underlying product used by the scripts or templates (the VM-Series
firewall) is still supported, but the support covers only the product's
functionality, not help in deploying or using the template or script
itself. Unless explicitly tagged, all projects or work posted in our GitHub
repository (at https://github.com/PaloAltoNetworks) or sites other than our
official Downloads page on https://support.paloaltonetworks.com are provided
under the best effort policy.
---
sidebar_position: 1
title: AON Web3 AI Application Development Documentation
slug: glLINbO4aapcxNlBi7Asm
createdAt: Wed Dec 14 2022 06:30:46 GMT+0000 (Coordinated Universal Time)
updatedAt: Fri Jul 19 2024 03:44:07 GMT+0000 (Coordinated Universal Time)
---
# AON (AGI Open Network) Developer Documentation Introduction Page
## The Best Way to Quickly Build and Share Web3 AI Apps
AON (AGI Open Network) is an AI application blockchain. By opening up an innovative protocol stack for the AI value layer, together with infrastructure such as distributed idle computing power access, a large model inference API platform, and the D'AI-Store, it enables developers to rapidly build blockchain-driven AI applications, accelerating Web3 AI innovation globally.
### Vision
To build an open and collaborative Web3 AI technology ecosystem to promote the integration and innovation of AI and Web3 worldwide.
### Mission
Utilizing the modular and multi-layered AON protocol stack, blockchain technology is used to unlock the commercial potential of AI technology, ensuring fair distribution of value and assetization of AI applications.
### Blockchain, Protocol Stack, and Ecosystem
### Blockchain
AON is an Ethereum Layer 2 public chain that adopts zero-knowledge proof technology and focuses on AI applications, offering high security, high performance, and low gas fees.
### Protocol Stack
The AI protocol stack uses blockchain technology to confirm ownership, ensure a reasonable distribution of benefits, establish a deflationary economic model, and form an ecosystem.
### Ecosystem
The AI ecosystem serves three core roles: developers, users, and idle computing power providers, with AON token incentives aligning the interests of all parties.
## Core Technical Advantages and Product Features
### Access Distributed Idle Computing Power
### AI Open Source Model API Platform
Quickly build AI applications by using various AI models in code.
- Stable-Diffusion
- Face-To-Many
- Llama3
- Incredibly-Fast-Whisper
### D'AI-STORE
D'AI-STORE provides an SDK for quickly integrating with Web user systems. Through various on-chain protocols, such as computing power minting and application payment, it enhances the commercial value of AI applications and helps them realize that value quickly.
## Ecosystem Participant Network
In the AON network, everyone can find a suitable role, whether they are highly skilled developers, experienced participants, or blockchain beginners. Come and learn how to participate and get rewards.
### Developers
Developers include blockchain developers, large model developers, and AGI product developers, who build and maintain the underlying protocols and applications in the ecosystem. Developers can earn revenue through sales or subscriptions in the application market, and receive technical support and training that reduce development difficulty and improve efficiency. Contributors are rewarded through incentives such as token distribution and participation income.
### Users
Users can not only use products but also accumulate tokens by participating in specific activities. This enhances the diversity and vitality of the ecosystem, and accumulated tokens can be spent to use more AI applications for free or at a lower cost.
### Idle Computing Power Providers
Individuals or institutions provide computing power resources to support inference for large models and AGI products, and collect and supply the high-quality data required for training large models. They earn token rewards based on service time and quality, which encourages the continuous provision of computing power; based on the number of data calls, providers are also rewarded with tokens, ensuring that the value of data is maximized.
## Come Join Our Vibrant Community!
Start learning how to build your first Web3 AI application now.
Quickly experience a large number of D'AI apps.
© 2024 AON Foundation All Rights Reserved
---
title: AI Face Swap API Usage Guide
slug: eC4eIgRJKwIGIsmMqAH4l
createdAt: Thu Jul 18 2024 06:04:57 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:43:11 GMT+0000 (Coordinated Universal Time)
---
# AI Face Swap API Usage Guide
## Introduction
This document will guide developers on how to use the aonet library to call the AI Face Swap API.
## Prerequisites
- Node.js environment
- `aonet` library installed
- Valid Aonet APPID
## Installation
Ensure that the aonet library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonet
```
## Steps to Use
### 1. Import the aonet library
```js
const AI = require("aonet");
```
### 2. Configure Options
Create an options object that includes your APPID:
```js
const options = {
  appid: "your APPID"
};
```
Make sure to replace "your APPID" with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```js
const aonet = new AI(options);
```
### 4. Call the Face Swap API
Use the `prediction` method to call the Face Swap API:
```js
async function performFaceSwap() {
  try {
    let response = await aonet.prediction("/predictions/ai/face-swap", {
      input: {
        "swap_image": "https://aonet.ai/pbxt/JoBuzfSVFLb5lBqkf3v9xMnqx3jFCYhM5JcVInFFwab8sLg0/long-trench-coat.png",
        "target_image": "https://replicate.delivery/pbxt/JoBuz3wGiVFQ1TDEcsGZbYcNh0bHpvwOi32T1fmxhRujqcu7/9X2.png"
      }
    });
    console.log("Face swap result:", response);
  } catch (error) {
    console.error("Error performing face swap:", error);
  }
}

performFaceSwap();
```
### Parameter Description
- `swap_image`: URL of the image containing the face to be swapped.
- `target_image`: URL of the target image where the face will be swapped.
### Notes
- Ensure that the provided image URLs are publicly accessible.
- The API may take some time to process the images; consider implementing appropriate wait or retry logic.
- Handle possible errors, such as network issues or API limitations.
### Example Response
The API response will contain the URL of the processed image or other relevant information. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Consider implementing error retry mechanisms.
- Add image validation logic to ensure the provided URLs point to valid image files.
- For production environments, consider implementing rate limiting and caching mechanisms to optimize API usage.
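The retry mechanism suggested above can be sketched as a small wrapper. Note this is only a sketch: `withRetry` is not part of the `aonet` library, and the retry counts and delays are arbitrary assumptions:

```javascript
// Hypothetical retry helper: re-runs a failing async call with a growing delay.
async function withRetry(fn, retries = 3, delayMs = 1000) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt >= retries) throw error; // give up after the final attempt
      // Linear backoff: wait longer after each failed attempt.
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
}

// Usage sketch, with the aonet instance and endpoint from the example above:
// const response = await withRetry(() =>
//   aonet.prediction("/predictions/ai/face-swap", { input: { /* ... */ } })
// );
```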
---
title: AnyText API Usage Guide
slug: 0tF--any
createdAt: Tue Jul 30 2024 05:31:14 GMT+0000 (Coordinated Universal Time)
updatedAt: Wed Jul 31 2024 09:01:31 GMT+0000 (Coordinated Universal Time)
---
# AnyText API Usage Guide
## Introduction
This document will guide developers on how to use the `aonet` library to invoke the AnyText API, which generates and edits text rendered within images.
## Prerequisites
- Node.js environment
- `aonet` library installed
- Valid Aonet APPID
## Installation
Ensure the `aonet` library is installed. If not, you can install it using npm:
```bash
npm install aonet
```
## Usage Instructions
### 1. Import the `aonet` Library
```javascript
const AI = require("aonet");
```
### 2. Configure Options
Create an `options` object containing your APPID:
```javascript
const options = {
  appid: "your_APPID"
};
```
Make sure to replace `"your_APPID"` with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```javascript
const aonet = new AI(options);
```
### 4. Invoke AnyText API
Use the `prediction` method to call the AnyText API:
```javascript
async function generateAnyText() {
  try {
    let response = await aonet.prediction("/predictions/ai/anytext", {
      input: {
        "mode": "text-generation",
        "prompt": "photo of caramel macchiato coffee on the table, top-down perspective, with \"Any\" \"Text\" written on it using cream",
        "seed": 200,
        "draw_pos": "https://replicate.delivery/pbxt/LIHKXdjxOWFe7HqP1rliIsghRab48EVQRzGNwQ9RgyO5V03d/gen9.png",
        "ori_image": "https://replicate.delivery/pbxt/LIHMZ8cCvmndHNVufiSuKZA4mnokuSOy87cYqhvs4Diei7sL/edit9.png",
        "img_count": 2,
        "ddim_steps": 20,
        "use_fp32": false,
        "no_translator": false,
        "strength": 1,
        "img_width": 512,
        "img_height": 512,
        "cfg_scale": 9,
        "a_prompt": "best quality, extremely detailed,4k, HD, supper legible text, clear text edges, clear strokes, neat writing, no watermarks",
        "n_prompt": "low-res, bad anatomy, extra digit, fewer digits, cropped, worst quality, low quality, watermark, unreadable text, messy words, distorted text, disorganized writing, advertising picture",
        "sort_radio": "",
        "revise_pos": false
      }
    });
    console.log("AnyText result:", response);
  } catch (error) {
    console.error("Error calling AnyText:", error);
  }
}

generateAnyText();
```
### Parameter Description
- `mode`: string, the model mode to invoke; fixed value.
- `prompt`: string, describes the content of the image.
- `seed`: integer, random seed, range -1 to 99999999.
- `draw_pos`: URL of an image indicating the positions of the generated text.
- `ori_image`: URL of the image to be edited.
- `img_count`: integer, number of images to generate, range 1 to 16.
- `ddim_steps`: integer, number of sampling steps, range 1 to 100.
- `use_fp32`: boolean, whether to run inference in FP32 precision.
- `no_translator`: boolean, disables the built-in translator.
- `strength`: float, control strength of the text control module, range 0.0 to 2.0.
- `img_width`: integer, image width; valid only in "text-generation" mode, range 256 to 768 px.
- `img_height`: integer, image height; valid only in "text-generation" mode, range 256 to 768 px.
- `cfg_scale`: float, Classifier-Free Guidance (CFG) strength, range 0.1 to 30.0.
- `a_prompt`: string, additional prompt words, typically used to enhance the image quality.
- `n_prompt`: string, negative prompt words.
- `sort_radio`: string, position sorting priority.
- `revise_pos`: boolean, whether to revise the drawn text positions.
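Because several of these parameters have documented valid ranges, it can help to check inputs on the client before calling the API. `validateAnyTextInput` below is a hypothetical helper built only from the ranges listed above; it is not part of the `aonet` library:

```javascript
// Hypothetical pre-flight check against the documented parameter ranges.
function validateAnyTextInput(input) {
  const errors = [];
  const inRange = (value, min, max) => value >= min && value <= max;
  if (!inRange(input.seed, -1, 99999999)) errors.push("seed must be in -1..99999999");
  if (!inRange(input.img_count, 1, 16)) errors.push("img_count must be in 1..16");
  if (!inRange(input.ddim_steps, 1, 100)) errors.push("ddim_steps must be in 1..100");
  if (!inRange(input.strength, 0.0, 2.0)) errors.push("strength must be in 0.0..2.0");
  if (!inRange(input.img_width, 256, 768)) errors.push("img_width must be in 256..768");
  if (!inRange(input.img_height, 256, 768)) errors.push("img_height must be in 256..768");
  if (!inRange(input.cfg_scale, 0.1, 30.0)) errors.push("cfg_scale must be in 0.1..30.0");
  return errors; // empty array means the input passed all range checks
}
```

Calling this before `aonet.prediction` lets you surface range mistakes locally instead of waiting on an API error.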
## Considerations
- Ensure the provided image URL is publicly accessible and of good quality for optimal recognition results.
- The API may take some time to process the image and generate results; consider implementing appropriate waiting or loading states.
- Handle potential errors, such as network issues, invalid input, or API limitations.
- Adhere to terms of use and privacy regulations, especially when processing images containing sensitive information.
- Enter the descriptive prompt (both Chinese and English are supported) in `prompt`, enclosing each line of text to be generated in double quotes, then draw the position for each line of text in sequence. The quality of the generated image depends heavily on how the text positions are drawn, so do not draw them too casually or too small. The number of positions must match the number of text lines, and each position's size should match the length or height of the corresponding text line as closely as possible.
## Example Response
The API response will include the image content after text generation or text editing. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement batch image processing by processing multiple image files in a loop or concurrently.
- Add a user interface that allows users to upload their own images or provide image URLs.
- Implement real-time text rendering by integrating the API into live image streams.
- Integrate post-processing features for text, such as punctuation addition, semantic analysis, or sentiment analysis.
- Consider implementing multi-language support to handle images in different languages as needed.
By following this guide, you should be able to effectively use the AnyText API for text generation and editing in images. If you have any questions or need further clarification, feel free to ask.
---
title: FunASR API Usage Guide
slug: gKriAn53WLFxGYd6BwWvy
createdAt: Thu Jul 18 2024 06:17:27 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 06:42:25 GMT+0000 (Coordinated Universal Time)
---
# FunASR API Usage Guide
## Introduction
This document will guide developers on how to use the `aonet` library to invoke the FunASR API, which is used for Automatic Speech Recognition (ASR).
## Prerequisites
- Node.js environment
- `aonet` library installed
- Valid Aonet APPID
## Installation
Ensure the `aonet` library is installed. If not, you can install it using npm:
```bash
npm install aonet
```
## Usage Instructions
### 1. Import the `aonet` Library
```javascript
const AI = require("aonet");
```
### 2. Configure Options
Create an `options` object containing your APPID:
```javascript
const options = {
  appid: "your_APPID"
};
```
Make sure to replace `"your_APPID"` with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```javascript
const aonet = new AI(options);
```
### 4. Invoke FunASR API
Use the `prediction` method to call the FunASR API:
```javascript
async function performSpeechRecognition() {
  try {
    let response = await aonet.prediction("/predictions/ai/funasr", {
      input: {
        "awv": "https://aonet.ai/mgxm/d9fa255c-4c47-4fec-99ce-f190539f10c4/olle.mp3",
        "batch_size": 300
      }
    });
    console.log("FunASR result:", response);
  } catch (error) {
    console.error("Error performing speech recognition:", error);
  }
}

performSpeechRecognition();
```
### Parameter Description
- `awv`: String, specifies the URL of the audio file to be recognized.
- `batch_size`: Integer, specifies the batch size for processing the audio. This may affect processing speed and memory usage.
## Considerations
- Ensure the provided audio URL is publicly accessible and of good quality for optimal recognition results.
- The API may take some time to process the audio and generate results, especially for longer audio files. Consider implementing appropriate waiting or loading states.
- Handle potential errors, such as network issues, invalid input, or API limitations.
- Adhere to terms of use and privacy regulations, especially when processing audio containing sensitive information.
## Example Response
The API response will contain the recognized text content. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement batch audio processing by processing multiple audio files in a loop or concurrently.
- Add a user interface to allow users to upload their audio files or provide audio URLs.
- Implement real-time speech recognition by integrating the API into live audio streams.
- Integrate post-processing features for text, such as punctuation addition, semantic analysis, or sentiment analysis.
- Consider implementing multi-language support to handle audio in different languages as needed.
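The batch-processing suggestion above can be sketched as follows. `recognizeAll` is a hypothetical helper; its `predict` argument stands in for the `aonet.prediction` call shown earlier, so the helper itself stays easy to test in isolation:

```javascript
// Hypothetical batch helper: recognizes several audio files concurrently.
// `predict(path, body)` is expected to behave like aonet.prediction.
async function recognizeAll(predict, audioUrls, batchSize = 300) {
  return Promise.all(
    audioUrls.map((awv) =>
      predict("/predictions/ai/funasr", { input: { awv, batch_size: batchSize } })
        .then((response) => ({ awv, response }))
        // Catch per item so one failed file does not abort the whole batch.
        .catch((error) => ({ awv, error: error.message }))
    )
  );
}
```

Pass `aonet.prediction.bind(aonet)` as `predict` to run it against the real API.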
By following this guide, you should be able to effectively use the FunASR API for automatic speech recognition in your applications. If you have any questions or need further clarification, feel free to ask.
---
title: IDM-VTON AI Model Usage Guide
slug: TaQ6XsKxYhbY62kmBLv72
createdAt: Thu Jul 18 2024 05:50:21 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:44:27 GMT+0000 (Coordinated Universal Time)
---
# IDM-VTON AI Model Usage Guide
## Introduction
This document describes how to use the aonet library to call the IDM-VTON AI model. This model is used for virtual try-on, allowing specified clothing images to be applied to human images.
## Prerequisites
- Node.js environment
- `aonet` library installed
- Valid Aonet APPID
## Installation
Ensure that the aonet library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonet
```
## Basic Usage
### 1. Import the AI Class
```js
const AI = require("aonet");
```
### 2. Configure Options
Create an options object containing authentication information:
```js
const options = {
  appId: "$APPID_KEY" // Replace with your APPID key
};
```
### 3. Initialize AI Instance
```js
const aonet = new AI(options);
```
### 4. Call the IDM-VTON Model
Use the `prediction` method to call the model:
```js
async function callIDMVTON() {
  try {
    let response = await aonet.prediction("/predictions/ai/idm-vton", {
      input: {
        "seed": 42,
        "steps": 30,
        "garm_img": "https://aonet.ai/pbxt/KgwTlZyFx5aUU3gc5gMiKuD5nNPTgliMlLUWx160G4z99YjO/sweater.webp",
        "human_img": "https://replicate.delivery/pbxt/KgwTlhCMvDagRrcVzZJbuozNJ8esPqiNAIJS3eMgHrYuHmW4/KakaoTalk_Photo_2024-04-04-21-44-45.png",
        "garment_des": "cute pink top"
      }
    });
    console.log("IDM-VTON Response:", response);
    return response;
  } catch (error) {
    console.error("Error calling IDM-VTON:", error);
    throw error;
  }
}
```
### Parameter Description
- `seed`: Random seed for generating reproducible results
- `steps`: Number of processing steps
- `garm_img`: URL of the garment image
- `human_img`: URL of the human image
- `garment_des`: Description of the garment
### Handling the Response
The model's response will contain the processed results. Depending on your application's needs, you may need to parse and use specific fields from the response.
### Error Handling
Use try-catch blocks to catch and handle possible errors.
## Best Practices
- **Store and Manage API Keys**: Do not hard-code API keys in your code. Use environment variables or secure key management systems.
- **Input Validation**: Validate all input parameters before sending requests.
- **Error Handling**: Implement comprehensive error handling, including network errors, API limits, and invalid responses.
- **Caching Strategy**: Consider implementing caching mechanisms to reduce duplicate requests and improve application performance.
- **Asynchronous Processing**: Use async/await or Promises to handle asynchronous operations, ensuring the main thread is not blocked.
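The input-validation practice can be sketched as a small pre-flight check run before `prediction` is called. `validateVtonInput` is a hypothetical helper, not part of the `aonet` library:

```javascript
// Hypothetical pre-flight check for IDM-VTON inputs.
function validateVtonInput(input) {
  for (const key of ["garm_img", "human_img"]) {
    let url;
    try {
      url = new URL(input[key]); // throws if the value is not a parseable URL
    } catch {
      return { ok: false, reason: `${key} is not a valid URL` };
    }
    if (url.protocol !== "http:" && url.protocol !== "https:") {
      return { ok: false, reason: `${key} must use http or https` };
    }
  }
  if (!Number.isInteger(input.steps) || input.steps < 1) {
    return { ok: false, reason: "steps must be a positive integer" };
  }
  return { ok: true };
}
```

Rejecting bad input locally avoids spending API quota on requests that cannot succeed.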
### Notes
- Ensure you have enough API call quota.
- Ensure the validity and accessibility of image URLs.
- Adhere to the API provider's terms of use and restrictions.
## Example Code
```js
const AI = require("aonet");

async function runIDMVTON() {
  const options = {
    auth: process.env.AONET_API_KEY // Store API key in environment variable
  };
  const aonet = new AI(options);
  try {
    const response = await aonet.prediction("/predictions/ai/idm-vton", {
      input: {
        "seed": 42,
        "steps": 30,
        "garm_img": "https://example.com/garment.jpg",
        "human_img": "https://example.com/person.jpg",
        "garment_des": "elegant blue dress"
      }
    });
    console.log("IDM-VTON Result:", response);
    // Further processing of the response...
  } catch (error) {
    console.error("Error in IDM-VTON process:", error);
    // Error handling...
  }
}

runIDMVTON();
```
## Conclusion
By following this guide, you should be able to successfully integrate and use the IDM-VTON AI model for virtual try-on application development. If you encounter any issues or need further assistance, please refer to the official aonet documentation or contact technical support.
---
title: LLaMA 3 API Usage Guide
slug: 1RnP-Ug_aRMwdOWHQZZvz
createdAt: Thu Jul 18 2024 06:15:54 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:38:59 GMT+0000 (Coordinated Universal Time)
---
# LLaMA 3 API Usage Guide
## Introduction
This document will guide developers on how to use the aonet library to call the LLaMA 3 API for generating natural language text.
## Prerequisites
- Node.js environment
- `aonet` library installed
- Valid Aonet APPID
## Installation
Ensure that the aonet library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonet
```
## Steps to Use
### 1. Import the aonet library
```js
const AI = require("aonet");
```
### 2. Configure Options
Create an options object that includes your APPID:
```js
const options = {
  appid: "your APPID"
};
```
Make sure to replace "your APPID" with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```js
const aonet = new AI(options);
```
### 4. Call the LLaMA 3 API
Use the `prediction` method to call the LLaMA 3 API:
```js
async function generateText() {
  try {
    let response = await aonet.prediction("/predictions/ai/lllama3:0.0.8", {
      input: {
        "top_p": 1,
        "prompt": "Plan a day of sightseeing for me in San Francisco.",
        "temperature": 0.75,
        "system_prompt": "You are an old-timey gold prospector who came to San Francisco for the gold rush and then was teleported to the present day. Despite being from 1849, you have great knowledge of present-day San Francisco and its attractions. You are helpful, polite, and prone to rambling.",
        "max_new_tokens": 800,
        "repetition_penalty": 1
      }
    });
    console.log("LLaMA 3 result:", response);
  } catch (error) {
    console.error("Error generating text:", error);
  }
}

generateText();
```
### Parameter Description
- `top_p`: Number, controls the diversity of the output. When set to 1, it retains all possibilities.
- `prompt`: String, the user's input prompt, based on which the model generates a response.
- `temperature`: Number, controls the randomness of the output. Higher values produce more diverse but potentially less coherent output.
- `system_prompt`: String, sets the role and behavior of the AI assistant.
- `max_new_tokens`: Integer, specifies the maximum length of the generated text.
- `repetition_penalty`: Number, controls the penalty for repetition. When set to 1, no penalty is applied.
### Notes
- The quality and specificity of the prompt will directly impact the quality and relevance of the generated text.
- The API may take some time to process requests and generate text, consider implementing appropriate wait or loading states.
- Handle possible errors, such as network issues, invalid input, or API limitations.
- Adhere to the terms of use and content policies, especially when dealing with sensitive topics.
### Example Response
The API response will contain the generated text. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement a conversation system by maintaining conversation history to generate coherent multi-turn dialogues.
- Add a user interface that allows users to customize system prompts and other parameters.
- Implement text post-processing features, such as summary generation, keyword extraction, or sentiment analysis.
- Integrate a content filtering system to ensure the generated content complies with usage policies.
- Consider implementing a caching mechanism to improve response times for frequent queries.
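The conversation-system suggestion can be sketched by keeping the turn history in memory and flattening it into the prompt string, assuming the API accepts a single prompt as in the example above. The `Conversation` class and the `User:`/`Assistant:` transcript format are illustrative assumptions, not part of the API:

```javascript
// Hypothetical multi-turn wrapper that flattens history into one prompt string.
class Conversation {
  constructor(systemPrompt) {
    this.systemPrompt = systemPrompt; // passed separately as system_prompt
    this.turns = []; // each turn: { role: "user" | "assistant", text }
  }
  addTurn(role, text) {
    this.turns.push({ role, text });
  }
  buildPrompt(nextUserMessage) {
    const history = this.turns
      .map((t) => `${t.role === "user" ? "User" : "Assistant"}: ${t.text}`)
      .join("\n");
    return history ? `${history}\nUser: ${nextUserMessage}` : `User: ${nextUserMessage}`;
  }
}

// Usage sketch with the endpoint from the example above:
// const prompt = convo.buildPrompt("What should I see first?");
// const response = await aonet.prediction("/predictions/ai/lllama3:0.0.8",
//   { input: { prompt, system_prompt: convo.systemPrompt, max_new_tokens: 800 } });
// convo.addTurn("user", "What should I see first?");
// convo.addTurn("assistant", /* extract the text per the actual response format */);
```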
---
title: Pulid AI Model Usage Guide
slug: 4IssHB4DIXzEb2LWyU9SY
createdAt: Thu Jul 18 2024 05:48:05 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:45:52 GMT+0000 (Coordinated Universal Time)
---
# AI Model Usage Guide: /predictions/ai/pulid
## Introduction
This document describes how to use the `aonweb` library to call the `/predictions/ai/pulid` AI model. This model is primarily used for image processing and generation tasks, specifically for processing human images.
## Prerequisites
- Node.js environment
- `aonweb` library installed
- Valid Aonet APPID
## Basic Usage
### 1. Import Required Modules
```js
import { AI, AIOptions } from 'aonweb';
```
### 2. Initialize AI Instance
```js
const ai_options = new AIOptions({
  appId: 'your_app_id_here'
});

const aonet = new AI(ai_options);
```
### 3. Prepare Input Data
```js
const data = {
  input: {
    "prompt": "",
    "cfg_scale": 1.2,
    "num_steps": 4,
    "image_width": 768,
    "num_samples": 1,
    "image_height": 1024,
    "output_format": "webp",
    "identity_scale": 0.8,
    "mix_identities": false,
    "output_quality": 80,
    "generation_mode": "fidelity",
    "main_face_image": "your_image_url_here",
    "negative_prompt": ""
  }
};
```
### 4. Call the AI Model
```js
const price = 8; // Cost of the AI call

try {
  const response = await aonet.prediction("/predictions/ai/pulid", data, price);
  // Handle response
} catch (error) {
  // Error handling
}
```
### Parameter Description
- `prompt`: Text description for generating the image (optional)
- `cfg_scale`: Configuration scale, affecting the match between the generated result and the prompt
- `num_steps`: Number of steps in the generation process
- `image_width`: Width of the output image
- `image_height`: Height of the output image
- `num_samples`: Number of samples generated
- `output_format`: Output image format (e.g., "webp")
- `identity_scale`: Identity scale, affecting the similarity of the generated image to the original image
- `mix_identities`: Whether to mix multiple identities
- `output_quality`: Output image quality (0-100)
- `generation_mode`: Generation mode (e.g., "fidelity")
- `main_face_image`: URL of the main face image
- `negative_prompt`: Negative prompt specifying elements you do not want to appear in the generated image (optional)
### Handling the Response
```js
if (response && response.code == 200 && response.data) {
  const result = response.data;
  if (result.task.exec_code == 200 && result.task.is_success) {
    let url = result.output;
    if (Array.isArray(result.output)) {
      url = result.output[0];
    }
    // Use the generated image URL
  } else {
    // Handle task failure
    const errorMessage = result.task.task_error || result.task.api_error || result.message;
    // Display error message
  }
} else {
  // Handle API call failure
}
```
### Error Handling
Ensure to catch and appropriately handle possible errors:
```js
try {
  // AI model call code
} catch (error) {
  if (typeof error === 'string') {
    console.error(error);
  } else {
    console.error(error.message);
  }
  // Display error message to the user
}
```
## Best Practices
- Ensure users have uploaded valid images before calling the AI model.
- Consider adding a loading indicator as AI processing may take some time.
- Provide reasonable default values for different parameters to simplify user experience.
- Appropriately handle and display AI-generated results, such as showing them in a new page or modal.
### Notes
- Ensure you have enough balance to cover the AI call costs.
- Consider network bandwidth and processing time when handling large images.
- Adhere to the terms and policies of the AI service provider.
## Conclusion
This AI model offers powerful image processing and generation capabilities. By correctly setting parameters and handling responses, you can implement various interesting image transformation features in your application. If you encounter any issues or need further clarification, please refer to the official documentation or contact the support team.
---
title: SadTalker API Usage Guide
slug: v-jLmWFv0B5P3yrLwoFoA
createdAt: Thu Jul 18 2024 06:07:52 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:42:11 GMT+0000 (Coordinated Universal Time)
---
# SadTalker API Usage Guide
## Introduction
This document will guide developers on how to use the `aonweb` library to call the SadTalker API, which generates AI-driven talking avatars.
## Prerequisites
- Node.js environment
- `aonweb` library installed
- Valid Aonet APPID
## Installation
Ensure that the `aonweb` library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonweb
```
## Steps to Use
### 1. Import the aonweb library
```js
const AI = require("aonweb");
```
### 2. Configure Options
Create an options object that includes your APPID:
```js
const options = {
appid: "your APPID"
};
```
Make sure to replace "your APPID" with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```js
const aonet = new AI(options);
```
### 4. Call the SadTalker API
Use the `prediction` method to call the SadTalker API:
```js
async function generateTalkingAvatar() {
try {
let response = await aonet.prediction("/predictions/ai/sadtalker",
{
input: {
"still": true,
"enhancer": "gfpgan",
"preprocess": "full",
"driven_audio": "https://aonet.ai/pbxt/Jf1gczNATWiC94VPrsTTLuXI0ZmtuZ6k0aWBcQpr7VuRc5f3/japanese.wav",
"source_image": "https://replicate.delivery/pbxt/Jf1gcsODejVsGRd42eeUj0RXX11zjxzHuLuqXmVFwMAi2tZq/art_1.png"
}
});
console.log("SadTalker result:", response);
} catch (error) {
console.error("Error generating talking avatar:", error);
}
}
generateTalkingAvatar();
```
### Parameter Description
- `still`: Boolean; when `true`, uses still mode with fewer head movements (commonly paired with `preprocess: "full"`).
- `enhancer`: String, specifies the image enhancer to use, here using "gfpgan".
- `preprocess`: String, specifies the preprocessing method, here using "full".
- `driven_audio`: String, the URL of the audio file to drive the animation.
- `source_image`: String, the URL of the source image, which is the avatar to be animated.
### Notes
- Ensure that the provided image and audio URLs are publicly accessible.
- The API may take some time to process the input and generate the result, so consider implementing appropriate wait or retry logic.
- Handle possible errors, such as network issues, invalid input, or API limitations.
### Example Response
The API response will contain the URL of the generated talking avatar or other relevant information. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement error retry mechanisms, especially for long-running tasks.
- Add input validation logic to ensure the provided URLs point to valid image and audio files.
- Consider implementing progress tracking, especially for generating video output.
- For production environments, implement rate limiting and caching mechanisms to optimize API usage.
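The retry suggestion above can be sketched as a generic helper with exponential backoff. This is illustrative and not part of the `aonweb` API; the attempt count and delays are arbitrary example values.

```js
// Illustrative helper (not part of aonweb): retry an async task with
// exponential backoff, re-throwing the last error if all attempts fail.
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function withRetry(task, maxAttempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await task();
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts) {
        // wait 500ms, 1000ms, 2000ms, ... between attempts
        await sleep(baseDelayMs * 2 ** (attempt - 1));
      }
    }
  }
  throw lastError;
}
```

For example, `withRetry(() => generateTalkingAvatar())` would retry the call above up to three times before giving up.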
---
title: Stable Diffusion 3 API Usage Guide
slug: NkdKBoGp8nXHnLvwsRNfG
createdAt: Thu Jul 18 2024 06:09:30 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:41:12 GMT+0000 (Coordinated Universal Time)
---
# Stable Diffusion 3 API Usage Guide
## Introduction
This document will guide developers on how to use the `aonweb` library to call the Stable Diffusion 3 API for generating AI art images.
## Prerequisites
- Node.js environment
- `aonweb` library installed
- Valid Aonet APPID
## Installation
Ensure that the `aonweb` library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonweb
```
## Steps to Use
### 1. Import the aonweb library
```js
const AI = require("aonweb");
```
### 2. Configure Options
Create an options object that includes your APPID:
```js
const options = {
appid: "your APPID"
};
```
Make sure to replace "your APPID" with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```js
const aonet = new AI(options);
```
### 4. Call the Stable Diffusion 3 API
Use the `prediction` method to call the Stable Diffusion 3 API:
```js
async function generateImage() {
try {
let response = await aonet.prediction("/predictions/ai/stable-diffusion-3",
{
input: {
"cfg": 3.5,
"prompt": "a photo of vibrant artistic graffiti on a wall saying \"SD3 medium\"",
"aspect_ratio": "3:2",
"output_format": "webp",
"output_quality": 90,
"negative_prompt": ""
}
});
console.log("Stable Diffusion 3 result:", response);
} catch (error) {
console.error("Error generating image:", error);
}
}
generateImage();
```
### Parameter Description
- `cfg`: Number, controls how closely the generated image adheres to the prompt. Higher values make the image more accurate but may reduce creativity.
- `prompt`: String, the text prompt describing the image you want to generate.
- `aspect_ratio`: String, specifies the aspect ratio of the output image, such as "3:2".
- `output_format`: String, specifies the format of the output image, here using "webp".
- `output_quality`: Number, specifies the quality of the output image, ranging from 0 to 100.
- `negative_prompt`: String, describes elements you do not want to appear in the generated image.
### Notes
- The quality and specificity of the prompt will directly affect the quality and accuracy of the generated image.
- The API may take some time to process the request and generate the image, so consider implementing appropriate wait or loading states.
- Handle possible errors, such as network issues, invalid input, or API limitations.
- Adhere to the terms of use and copyright laws, especially in commercial use cases.
### Example Response
The API response will contain the URL of the generated image or other relevant information. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement batch image generation by processing multiple prompts in a loop or concurrent requests.
- Add a user interface that allows users to input custom prompts and parameters.
- Implement image post-processing features, such as resizing, cropping, or applying filters.
- Integrate an image storage solution to save and manage the generated images.
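Batch generation can be sketched as a small worker pool that limits how many requests run at once. In this sketch, `generateOne` stands in for one `aonet.prediction` call per prompt; the function is passed in so the helper can be exercised with a stub.

```js
// Sketch: generate images for several prompts with bounded concurrency.
// `generateOne` stands in for one aonet.prediction call per prompt.
async function generateBatch(prompts, generateOne, concurrency = 2) {
  const results = new Array(prompts.length);
  let next = 0;
  async function worker() {
    while (next < prompts.length) {
      const i = next; // claim the next prompt index
      next += 1;
      results[i] = await generateOne(prompts[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(concurrency, prompts.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

Because JavaScript is single-threaded between `await` points, the `next` counter can be claimed without locking.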
---
title: XTTS-V2 API Usage Guide
slug: 6DCWoM_fbbp1XuSHH8vVc
createdAt: Thu Jul 18 2024 06:12:37 GMT+0000 (Coordinated Universal Time)
updatedAt: Thu Jul 18 2024 13:40:04 GMT+0000 (Coordinated Universal Time)
---
# XTTS-V2 API Usage Guide
## Introduction
This document will guide developers on how to use the `aonweb` library to call the XTTS-V2 API, which is used for voice cloning and text-to-speech conversion.
## Prerequisites
- Node.js environment
- `aonweb` library installed
- Valid Aonet APPID
## Installation
Ensure that the `aonweb` library is installed. If it is not installed yet, you can install it using npm:
```bash
npm install aonweb
```
## Steps to Use
### 1. Import the aonweb library
```js
const AI = require("aonweb");
```
### 2. Configure Options
Create an options object that includes your APPID:
```js
const options = {
appid: "your APPID"
};
```
Make sure to replace "your APPID" with your actual Aonet APPID.
### 3. Initialize AI Instance
Initialize the AI instance using the configuration options:
```js
const aonet = new AI(options);
```
### 4. Call the XTTS-V2 API
Use the `prediction` method to call the XTTS-V2 API:
```js
async function generateClonedVoice() {
try {
let response = await aonet.prediction("/predictions/ai/xtts-v2",
{
input: {
"text": "Hi there, I'm your new voice clone. Try your best to upload quality audio",
"speaker": "https://aonet.ai/pbxt/Jt79w0xsT64R1JsiJ0LQRL8UcWspg5J4RFrU6YwEKpOT1ukS/male.wav",
"language": "en",
"cleanup_voice": false
}
});
console.log("XTTS-V2 result:", response);
} catch (error) {
console.error("Error generating cloned voice:", error);
}
}
generateClonedVoice();
```
### Parameter Description
- `text`: String, the text content to be converted into speech.
- `speaker`: String, the URL of the audio file used as the voice sample for cloning.
- `language`: String, specifies the language of the text, with "en" indicating English.
- `cleanup_voice`: Boolean, whether to perform cleanup processing on the generated voice.
### Notes
- Ensure that the provided audio URL is publicly accessible and of good quality to achieve the best cloning effect.
- The API may take some time to process the input and generate the result, so consider implementing appropriate wait or loading states.
- Handle possible errors, such as network issues, invalid input, or API limitations.
- Adhere to the terms of use and privacy regulations, especially when handling voice samples of others.
### Example Response
The API response will contain the URL of the generated cloned voice or other relevant information. Parse and use the response data according to the actual API documentation.
## Advanced Usage
- Implement batch text-to-speech conversion by processing multiple text segments in a loop or concurrent requests.
- Add a user interface that allows users to upload their own voice samples and input custom text.
- Implement voice post-processing features, such as adjusting volume, adding background music, or applying audio effects.
- Integrate a voice storage solution to save and manage the generated voice files.
- Consider implementing a voice recognition feature to convert the generated voice back to text for verification or other purposes.
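For batch text-to-speech, long input can be split into sentence-aligned chunks before each chunk is sent to the API. This helper is a sketch; the 200-character limit is an arbitrary example, not an API constraint.

```js
// Sketch: split long text into chunks under a length limit, breaking at
// sentence boundaries so each chunk sounds natural on its own.
function splitText(text, maxLen = 200) {
  const sentences = text.match(/[^.!?]+[.!?]*\s*/g) || [text];
  const chunks = [];
  let current = "";
  for (const s of sentences) {
    if (current && (current + s).length > maxLen) {
      chunks.push(current.trim());
      current = "";
    }
    current += s;
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}
```

Each chunk can then be passed as the `text` input of a separate XTTS-V2 call.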
{
"label": "Public Models API",
"position": 2,
"link": {
"type": "generated-index",
"description": "5 minutes to learn the most important Docusaurus concepts."
}
}
---
sidebar_position: 4
title: File and Image Upload
slug: rWvXw__Z48Cd1IrHU-QMl
createdAt: Thu Jul 18 2024 05:46:44 GMT+0000 (Coordinated Universal Time)
updatedAt: Fri Jul 19 2024 03:54:30 GMT+0000 (Coordinated Universal Time)
---
# File and Image Upload
This documentation provides a detailed explanation of how to handle file and image uploads in your application, including file size validation, preparing form data, and interacting with the upload API.
## Function: `onOversize`
### Description
The `onOversize` function is called when an uploaded file exceeds the maximum allowed size; it displays a toast notification to inform the user.
### Code
```javascript
const onOversize = (file) => {
  // Notify the user that the file is too large
  showToast('File size cannot exceed 30MB');
};
```
### Usage
- **Parameters:**
- `file`: The file object to be checked.
- **Behavior:**
- Displays a toast notification if the file size exceeds 30MB.
## Function: `afterRead`
### Description
The `afterRead` function processes the file after it has been read, prepares it for upload, and calls the `uploadFile` function to handle the actual upload.
### Code
```javascript
function afterRead(file) {
const formData = new FormData();
formData.append('file', file.file);
// Call the upload API
uploadFile(formData).then(res => {
if (res.code == 200 && res.data && res.data.length) {
imageStore.addImage(res.data);
}
}).catch(err => {
showToast('image upload failed');
console.log(err);
});
}
```
### Usage
- **Parameters:**
- `file`: The file object that has been read.
- **Behavior:**
- Creates a `FormData` object and appends the file.
- Calls the `uploadFile` function to upload the file.
- Adds the uploaded image to `imageStore` if the upload is successful.
- Displays an error toast if the upload fails.
## Function: `uploadFile`
### Description
The `uploadFile` function uploads the given file to the server using a POST request and returns the response.
### Code
```javascript
const uploadFile = async (formData) => {
const response = await fetch('https://tmp-file.aigic.ai/api/v1/upload?expires=1800&type=image/png', {
method: 'POST',
body: formData
});
const data = await response.json();
return data;
};
```
### Usage
- **Parameters:**
- `formData`: The `FormData` object containing the file to be uploaded.
- **Behavior:**
- Sends a POST request to the upload API endpoint.
- Returns the response data.
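If helpful, `uploadFile` and the success check can be combined into one wrapper. This is a sketch: the response shape (`code`, `data`) is assumed from the snippets in this guide, and the upload function is passed in so the helper can be tested with a stub.

```js
// Sketch: upload a file and return the uploaded URLs, or throw on failure.
// The response shape (code, data) is assumed from this guide's snippets.
async function uploadAndGetUrls(file, upload) {
  const formData = new FormData();
  formData.append('file', file);
  const res = await upload(formData);
  if (res && res.code === 200 && res.data && res.data.length) {
    return res.data;
  }
  throw new Error('image upload failed');
}
```

For example, `uploadAndGetUrls(file.file, uploadFile)` mirrors the `afterRead` flow above while surfacing failures as exceptions.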
## Workflow
1. **File Selection and Validation:**
- The user selects a file.
- The `onOversize` function checks if the file size exceeds 30MB. If it does, a toast notification is shown.
2. **File Upload:**
- The `afterRead` function is called with the selected file.
- A `FormData` object is created and the file is appended to it.
- The `uploadFile` function is called with the `FormData` object.
3. **API Interaction:**
- The `uploadFile` function sends a POST request to the upload API endpoint.
- If the upload is successful, the response data is added to the `imageStore`.
- If the upload fails, a toast notification is shown and the error is logged.
### Example Usage
```javascript
// Assuming this is part of a Vue component
onMounted(() => {
// File input event listener
document.getElementById('fileInput').addEventListener('change', (event) => {
const file = event.target.files[0];
if (file.size > 30 * 1024 * 1024) {
onOversize(file);
} else {
afterRead({ file });
}
});
});
```
### Notes
- Ensure the API endpoint (`https://tmp-file.aigic.ai/api/v1/upload`) and parameters (`expires` and `type`) are correctly set as per your backend requirements.
- Handle exceptions and errors gracefully to provide a good user experience.
- Customize the toast notifications as needed for your application.
By following this documentation, developers can easily integrate file and image upload functionality into their applications, ensuring proper validation, handling, and error management.
---
sidebar_position: 1
slug: /
---
# Introduction
`aonweb` is a library for AI application development, providing simple and easy-to-use APIs to integrate AI functionality into your applications. This guide introduces AI application development with the aonweb library.
## Installation
First, make sure you have installed the aonweb library:
```bash
npm install aonweb --save
```
## Basic Usage
### Import necessary modules
```javascript
import { AI, AIOptions, User } from 'aonweb'
```
### Initialize AI
Use AIOptions to create a configuration, then initialize the AI instance:
```javascript
const ai_options = new AIOptions({
    appId: "your APPID" // replace with your actual Aonet APPID
})
const aonet = new AI(ai_options)
```
---
sidebar_position: 3
title: User Account and Points Retrieval
slug: FPVY464Iqc2ZCl1pANyS0
createdAt: Thu Jul 18 2024 05:56:40 GMT+0000 (Coordinated Universal Time)
updatedAt: Fri Jul 19 2024 03:50:33 GMT+0000 (Coordinated Universal Time)
---
# AON Developer Documentation: Get User Account Function
This documentation provides a detailed explanation of the `getAccount` function, which handles user account retrieval, login authentication, and interaction with the Ethereum provider. It uses asynchronous operations to ensure a smooth, non-blocking user experience.
## Function: `getAccount`
### Description
The `getAccount` function checks if a user is logged in. If not, it initiates the login process. Once the user is logged in, it retrieves the user's Ethereum account. The function also handles loading indicators and emits events upon successful account retrieval.
### Code
```javascript
async function getAccount() {
try {
// Create a new User instance
let user = new User();
// Check if the user is logged in
const isLogin_status = await user.islogin();
console.log(isLogin_status, 'isLogin_status');
if (!isLogin_status) {
// Show loading indicator if the user is not logged in
showLoadingToast({
duration: 0,
forbidClick: true,
message: 'Loading...',
});
// Initiate login process
user.login((acc, userId, error) => {
// Close loading indicator
closeToast();
console.log("getWeb3 account", acc);
console.log("getWeb3 userId", userId);
console.log("getWeb3 error", error);
// Set the account value and emit event upon successful login
account.value = acc;
bus.emit('get_balance', "login");
});
} else {
// If user is already logged in, interact with Ethereum provider to get account
let ethereum = await detectEthereumProvider();
let get_account = await ethereum.request({ method: 'eth_requestAccounts' });
get_account = get_account[0];
account.value = get_account;
// Emit event after retrieving the account
bus.emit('get_balance', "login");
}
} catch (error) {
// Close loading indicator and handle errors
closeToast();
console.log(error, "getAccount error");
if (error && typeof error == 'string') {
showToast(error);
} else {
showToast(error.message);
}
} finally {
// Final clean-up actions if needed
}
}
```
### Detailed Explanation
1. **User Object Initialization:**
- Creates a new instance of the `User` object.
2. **Check Login Status:**
- Asynchronously checks if the user is logged in.
- Logs the login status.
3. **Handle Not Logged In Status:**
- Shows a loading toast if the user is not logged in.
4. **User Login Process:**
- Initiates the login process.
- Closes the loading toast upon completion.
- Logs the account, userId, and any error.
- Sets the account value and emits an event upon successful login.
5. **Handle Logged In Status:**
- Interacts with the Ethereum provider to get the account if the user is already logged in.
- Sets the account value and emits an event after retrieving the account.
6. **Error Handling:**
- Closes the loading toast in case of an error.
- Logs the error.
- Shows an appropriate error message to the user.
### Usage in Component Lifecycle
The `getAccount` function can be called within the component's lifecycle hooks to ensure that the user's account is retrieved and logged in status is checked when necessary.
### Example Usage
```javascript
onMounted(() => {
getAccount();
});
```
This setup ensures that the user account status is checked and handled as soon as the component is ready, providing a seamless experience for the user.
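The `bus.emit('get_balance', "login")` calls above assume a mitt-style event bus shared across components. A minimal sketch of such a bus, with a listener for the event emitted after login (the listener body is a hypothetical example, not code from this app):

```js
// Sketch: a minimal mitt-style event bus, as assumed by `bus` above.
function createBus() {
  const handlers = {};
  return {
    on(event, fn) {
      (handlers[event] = handlers[event] || []).push(fn);
    },
    emit(event, payload) {
      (handlers[event] || []).forEach((fn) => fn(payload));
    },
  };
}

const bus = createBus();
bus.on('get_balance', (source) => {
  // refresh the user's balance; `source` is "login" when emitted by getAccount
  console.log('refreshing balance, triggered by:', source);
});
bus.emit('get_balance', 'login');
```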
---
sidebar_position: 2
title: User Login Function
slug: RnOw-user-login-function
createdAt: Fri Jul 19 2024 03:45:40 GMT+0000 (Coordinated Universal Time)
updatedAt: Fri Jul 19 2024 03:56:00 GMT+0000 (Coordinated Universal Time)
---
# AON Developer Documentation: User Login Function
This documentation provides a detailed explanation of the `login` function, which handles user login in an asynchronous manner, including checks for user authentication, showing loading indicators, and emitting events upon successful login.
## Function: `login`
### Description
The `login` function checks whether a user is logged in; if not, it attempts to log the user in by repeatedly polling for owned users. It uses asynchronous operations to keep the login process smooth and non-blocking.
### Code
```javascript
async function login() {
try {
let time = new Date().getTime();
console.log(`demo index login start time = ${time}`);
let user = new User();
let temp = await user.islogin();
console.log(`demo index islogin end time = ${time}, temp = ${temp}`);
if (!temp) {
showLoadingToast({
duration: 0,
forbidClick: true,
message: 'Loading...',
});
console.log(`demo index showLoadingToast end time = ${time}`);
for (let i = 0; i < 5; i++) {
let result = await user.getOwnedUsers();
let userid = result && result._userIds && result._userIds.length && result._userIds[0];
if (userid && userid.length) {
break;
}
await sleep(300);
}
closeToast();
temp = await user.islogin();
if (!temp) {
showToast("login failed, please try again later");
return;
}
}
bus.emit('get_balance', "login");
console.log(`demo index login end time = ${new Date().getTime() - time}`);
} catch (error) {
console.log("index demo error", error);
closeToast();
if (error && typeof error == 'string') {
showToast(error);
} else {
showToast(error.message);
}
} finally {
}
}
onMounted(() => {
getTemplateList();
login();
});
```
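The polling loop above calls `sleep`, which is not defined in the snippet; a minimal promise-based implementation would be:

```js
// Resolve after `ms` milliseconds, so callers can `await sleep(300)`.
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
```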
### Detailed Explanation
1. **Initialization and Logging Start Time:**
- Records the start time for login.
- Logs the start time to the console.
2. **User Object and Initial Login Check:**
- Creates a new `User` object.
- Checks if the user is already logged in by calling `user.islogin()` and logs the result.
3. **Show Loading Indicator:**
- If the user is not logged in, shows a loading toast to inform the user that a login attempt is in progress.
4. **Repeated Attempts to Get Owned Users:**
- Attempts to get the list of owned users up to 5 times, with a 300ms delay between each attempt.
- Breaks the loop if a valid user ID is found.
5. **Close Toast and Final Login Check:**
- Closes the loading toast.
- Checks the login status again. If still not logged in, shows an error toast and exits the function.
6. **Emit Event and Log End Time:**
- Emits an event to get the balance.
- Logs the end time to the console.
7. **Error Handling:**
- Catches any errors that occur during the login process.
- Logs the error to the console.
- Closes the toast and shows an appropriate error message to the user.
## Usage in Component Lifecycle
The `login` function is called within the `onMounted` lifecycle hook, ensuring that the login process is initiated when the component is mounted.
```javascript
onMounted(() => {
login();
});
```
- `getTemplateList()`: Fetches the list of templates (additional context not provided in the snippet).
- `login()`: Initiates the login process.
This setup ensures that the user login status is checked and handled as soon as the component is ready, providing a seamless experience for the user.
{
"label": "Quikc Start",
"position": 1,
"link": {
"type": "generated-index",
"description": "5 minutes to learn the most important Docusaurus concepts."
}
}
// @ts-check
// Note: type annotations allow type checking and IDEs autocompletion
const lightCodeTheme = require("prism-react-renderer/themes/github");
const darkCodeTheme = require("prism-react-renderer/themes/dracula");
/** @type {import('@docusaurus/types').Config} */
const config = {
title: "AON",
tagline: "AON ai prediction",
url: "https://aigic.ai",
baseUrl: "/",
onBrokenLinks: "throw",
onBrokenMarkdownLinks: "warn",
favicon: "img/favicon.ico",
// GitHub pages deployment config.
// If you aren't using GitHub pages, you don't need these.
organizationName: "aigic", // Usually your GitHub org/user name.
projectName: "docusaurus", // Usually your repo name.
presets: [
[
"classic",
/** @type {import('@docusaurus/preset-classic').Options} */
({
docs: {
sidebarPath: require.resolve("./sidebars.js"),
routeBasePath: "/",
// Please change this to your repo.
// Remove this to remove the "edit this page" links.
// editUrl:
// "https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/",
docLayoutComponent: "@theme/DocPage",
docItemComponent: "@theme/ApiItem", // Derived from docusaurus-theme-openapi
},
blog: {
showReadingTime: true,
// Please change this to your repo.
// Remove this to remove the "edit this page" links.
editUrl:
"https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/",
},
theme: {
customCss: require.resolve("./src/css/custom.css"),
},
}),
],
],
themeConfig:
/** @type {import('@docusaurus/preset-classic').ThemeConfig} */
({
docs: {
sidebar: {
hideable: true,
},
},
colorMode: {
defaultMode: 'dark',
disableSwitch: true,
respectPrefersColorScheme: false
},
navbar: {
// title: "My Site",
logo: {
srcDark: 'img/logo_.44558a825a06f9c7bc29.png',
src: "img/logo.png",
},
items: [
{
href: "https://aigic.ai/playground",
label: "Built with AON",
position: "right",
},
],
},
prism: {
theme: lightCodeTheme,
darkTheme: darkCodeTheme,
additionalLanguages: ["ruby", "csharp", "php"],
},
}),
plugins: [
[
"docusaurus-plugin-openapi-docs",
{
id: "openapi",
docsPluginId: "classic",
config: {
petstore: {
specPath: "examples/petstore.yaml",
outputDir: "docs/petstore",
downloadUrl:
"https://raw.githubusercontent.com/PaloAltoNetworks/docusaurus-template-openapi-docs/main/examples/petstore.yaml",
sidebarOptions: {
groupPathsBy: "tag",
categoryLinkSource: "tag",
},
},
},
},
],
],
themes: ["docusaurus-theme-openapi-docs"],
};
module.exports = config;
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[4013],{39058:(e,t,a)=>{a.d(t,{Z:()=>k});var l=a(67294),n=a(86010),r=a(95677),s=a(87524),c=a(39960),i=a(95999);const m="sidebar_re4s",o="sidebarItemTitle_pO2u",u="sidebarItemList_Yudw",d="sidebarItem__DBe",p="sidebarItemLink_mo7H",g="sidebarItemLinkActive_I1ZP";function E(e){let{sidebar:t}=e;return l.createElement("aside",{className:"col col--3"},l.createElement("nav",{className:(0,n.default)(m,"thin-scrollbar"),"aria-label":(0,i.I)({id:"theme.blog.sidebar.navAriaLabel",message:"Blog recent posts navigation",description:"The ARIA label for recent posts in the blog sidebar"})},l.createElement("div",{className:(0,n.default)(o,"margin-bottom--md")},t.title),l.createElement("ul",{className:(0,n.default)(u,"clean-list")},t.items.map((e=>l.createElement("li",{key:e.permalink,className:d},l.createElement(c.default,{isNavLink:!0,to:e.permalink,className:p,activeClassName:g},e.title)))))))}var b=a(13102);function f(e){let{sidebar:t}=e;return l.createElement("ul",{className:"menu__list"},t.items.map((e=>l.createElement("li",{key:e.permalink,className:"menu__list-item"},l.createElement(c.default,{isNavLink:!0,to:e.permalink,className:"menu__link",activeClassName:"menu__link--active"},e.title)))))}function _(e){return l.createElement(b.Zo,{component:f,props:e})}function h(e){let{sidebar:t}=e;const a=(0,s.i)();return t?.items.length?"mobile"===a?l.createElement(_,{sidebar:t}):l.createElement(E,{sidebar:t}):null}function k(e){const{sidebar:t,toc:a,children:s,...c}=e,i=t&&t.items.length>0;return l.createElement(r.Z,c,l.createElement("div",{className:"container margin-vert--lg"},l.createElement("div",{className:"row"},l.createElement(h,{sidebar:t}),l.createElement("main",{className:(0,n.default)("col",{"col--7":i,"col--9 col--offset-1":!i}),itemScope:!0,itemType:"http://schema.org/Blog"},s),a&&l.createElement("div",{className:"col 
col--2"},a))))}},20472:(e,t,a)=>{a.r(t),a.d(t,{default:()=>g});var l=a(67294),n=a(86010),r=a(35155),s=a(10833),c=a(35281),i=a(39058),m=a(13008);const o="tag_Nnez";function u(e){let{letterEntry:t}=e;return l.createElement("article",null,l.createElement("h2",null,t.letter),l.createElement("ul",{className:"padding--none"},t.tags.map((e=>l.createElement("li",{key:e.permalink,className:o},l.createElement(m.Z,e))))),l.createElement("hr",null))}function d(e){let{tags:t}=e;const a=(0,r.P)(t);return l.createElement("section",{className:"margin-vert--lg"},a.map((e=>l.createElement(u,{key:e.letter,letterEntry:e}))))}var p=a(90197);function g(e){let{tags:t,sidebar:a}=e;const m=(0,r.M)();return l.createElement(s.FG,{className:(0,n.default)(c.k.wrapper.blogPages,c.k.page.blogTagsListPage)},l.createElement(s.d,{title:m}),l.createElement(p.Z,{tag:"blog_tags_list"}),l.createElement(i.Z,{sidebar:a},l.createElement("h1",null,m),l.createElement(d,{tags:t})))}},13008:(e,t,a)=>{a.d(t,{Z:()=>m});var l=a(67294),n=a(86010),r=a(39960);const s="tag_zVej",c="tagRegular_sFm0",i="tagWithCount_h2kH";function m(e){let{permalink:t,label:a,count:m}=e;return l.createElement(r.default,{href:t,className:(0,n.default)(s,m?i:c)},a,m&&l.createElement("span",null,m))}},35155:(e,t,a)=>{a.d(t,{M:()=>n,P:()=>r});var l=a(95999);const n=()=>(0,l.I)({id:"theme.tags.tagsPageTitle",message:"Tags",description:"The title of the tag list page"});function r(e){const t={};return Object.values(e).forEach((e=>{const a=function(e){return e[0].toUpperCase()}(e.label);t[a]??=[],t[a].push(e)})),Object.entries(t).sort(((e,t)=>{let[a]=e,[l]=t;return a.localeCompare(l)})).map((e=>{let[t,a]=e;return{letter:t,tags:a.sort(((e,t)=>e.label.localeCompare(t.label)))}}))}}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[1633],{62511:a=>{a.exports=JSON.parse('{"label":"facebook","permalink":"/blog/tags/facebook","allTagsPath":"/blog/tags","count":1}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[7178],{85010:e=>{e.exports=JSON.parse('{"permalink":"/blog/tags/facebook","page":1,"postsPerPage":10,"totalPages":1,"totalCount":1,"blogDescription":"Blog","blogTitle":"Blog"}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[9823],{24469:e=>{e.exports=JSON.parse('{"name":"docusaurus-plugin-content-blog","id":"default"}')}}]);
"use strict"; (self.webpackChunkdocusaurus_template_openapi_docs = self.webpackChunkdocusaurus_template_openapi_docs || []).push([[9671], { 59881: (e, t, o) => { o.r(t), o.d(t, { assets: () => u, contentTitle: () => n, default: () => c, frontMatter: () => a, metadata: () => l, toc: () => r }); var s = o(87462), i = (o(67294), o(3905)); const a = { sidebar_position: 1, slug: "/" }, n = "Introduction", l = { unversionedId: "intro", id: "intro", title: "Introduction", description: "Realize Countless Imaginations Just One Way", source: "@site/docs/intro.md", sourceDirName: ".", slug: "/", permalink: "/", draft: !1, tags: [], version: "current", sidebarPosition: 1, frontMatter: { sidebar_position: 1, slug: "/" }, sidebar: "tutorialSidebar", next: { title: "Quick Start", permalink: "/category/quick-start" } }, u = {}, r = [{ value: "Realize Countless Imaginations Just One Way", id: "realize-countless-imaginations-just-one-way", level: 2 }, { value: "With AON You Can", id: "with-aigic-you-can", level: 2 }, { value: "Always Use Ai Open-source Models", id: "always-use-ai-open-source-models", level: 3 }, { value: "Fine Tune The Model", id: "fine-tune-the-model", level: 3 }, { value: "Deploy Custom Ai Models", id: "deploy-custom-ai-models", level: 3 }, { value: "Why Choose To Use AON", id: "why-choose-to-use-aigic", level: 2 }, { value: "The Lowest Usage Price", id: "the-lowest-usage-price", level: 3 }, { value: "Full Encryption To Protect Privacy", id: "full-encryption-to-protect-privacy", level: 3 }], d = { toc: r }; function c(e) { let { components: t, ...o } = e; return (0, i.kt)("wrapper", (0, s.Z)({}, d, o, { components: t, mdxType: "MDXLayout" }), (0, i.kt)("h1", { id: "introduction" }, "Introduction"), (0, i.kt)("h2", { id: "realize-countless-imaginations-just-one-way" }, "Realize Countless Imaginations Just One Way"), (0, i.kt)("p", null, "No hardware required, with AON, you can run and adjust various types of artificial intelligence models in minutes, and copy the 
code into your own project."), (0, i.kt)("h2", { id: "with-aigic-you-can" }, "With AON You Can"), (0, i.kt)("h3", { id: "always-use-ai-open-source-models" }, "Always Use Ai Open-source Models"), (0, i.kt)("p", null, "You don't need expensive GPUs and hardware, you can run over 1000 open-source models on AON at any time and copy the code into your own project."), (0, i.kt)("h3", { id: "fine-tune-the-model" }, "Fine Tune The Model"), (0, i.kt)("p", null, "You can use the parameters you need to adjust the model and create a model that better suits your needs, such as an image model with a specific style or a language model that completes specific tasks."), (0, i.kt)("h3", { id: "deploy-custom-ai-models" }, "Deploy Custom Ai Models"), (0, i.kt)("p", null, "In addition to the models in the existing model library, you can also deploy your own custom models, which are stored on cloud servers and you only need to pay fees when using them."), (0, i.kt)("h2", { id: "why-choose-to-use-aigic" }, "Why Choose To Use AON"), (0, i.kt)("h3", { id: "the-lowest-usage-price" }, "The Lowest Usage Price"), (0, i.kt)("p", null, "AON utilizes the most advanced aggregated AI computing network, which not only provides users with a fast AI model experience but also greatly reduces their usage costs."), (0, i.kt)("h3", { id: "full-encryption-to-protect-privacy" }, "Full Encryption To Protect Privacy"), (0, i.kt)("p", null, "AON uses advanced models and data transmission encryption technology to protect user model data and copyright, as well as user privacy and trade secrets.")) } c.isMDXComponent = !0 } }]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[736],{28016:t=>{t.exports=JSON.parse('{"title":"Quick Start","description":"5 minutes to learn the most important Docusaurus concepts.","slug":"/category/quick-start","permalink":"/category/quick-start","navigation":{"previous":{"title":"Introduction","permalink":"/"},"next":{"title":"Get Started","permalink":"/quick-start/get_started"}}}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[9817],{1310:(e,t,a)=>{a.r(t),a.d(t,{default:()=>E});var n=a(87462),r=a(67294),l=a(86010),i=a(35281),s=a(53438),c=a(48596),o=a(39960),m=a(95999),d=a(44996);function u(e){return r.createElement("svg",(0,n.Z)({viewBox:"0 0 24 24"},e),r.createElement("path",{d:"M10 19v-5h4v5c0 .55.45 1 1 1h3c.55 0 1-.45 1-1v-7h1.7c.46 0 .68-.57.33-.87L12.67 3.6c-.38-.34-.96-.34-1.34 0l-8.36 7.53c-.34.3-.13.87.33.87H5v7c0 .55.45 1 1 1h3c.55 0 1-.45 1-1z",fill:"currentColor"}))}const b="breadcrumbHomeIcon_YNFT";function h(){const e=(0,d.Z)("/");return r.createElement("li",{className:"breadcrumbs__item"},r.createElement(o.default,{"aria-label":(0,m.I)({id:"theme.docs.breadcrumbs.home",message:"Home page",description:"The ARIA label for the home page in the breadcrumbs"}),className:"breadcrumbs__link",href:e},r.createElement(u,{className:b})))}const p="breadcrumbsContainer_Z_bl";function f(e){let{children:t,href:a,isLast:n}=e;const l="breadcrumbs__link";return n?r.createElement("span",{className:l,itemProp:"name"},t):a?r.createElement(o.default,{className:l,href:a,itemProp:"item"},r.createElement("span",{itemProp:"name"},t)):r.createElement("span",{className:l},t)}function v(e){let{children:t,active:a,index:i,addMicrodata:s}=e;return r.createElement("li",(0,n.Z)({},s&&{itemScope:!0,itemProp:"itemListElement",itemType:"https://schema.org/ListItem"},{className:(0,l.default)("breadcrumbs__item",{"breadcrumbs__item--active":a})}),t,r.createElement("meta",{itemProp:"position",content:String(i+1)}))}function E(){const e=(0,s.s1)(),t=(0,c.Ns)();return e?r.createElement("nav",{className:(0,l.default)(i.k.docs.docBreadcrumbs,p),"aria-label":(0,m.I)({id:"theme.docs.breadcrumbs.navAriaLabel",message:"Breadcrumbs",description:"The ARIA label for the 
breadcrumbs"})},r.createElement("ul",{className:"breadcrumbs",itemScope:!0,itemType:"https://schema.org/BreadcrumbList"},t&&r.createElement(h,null),e.map(((t,a)=>{const n=a===e.length-1;return r.createElement(v,{key:a,active:n,index:a,addMicrodata:!!t.href},r.createElement(f,{href:t.href,isLast:n},t.label))})))):null}},34228:(e,t,a)=>{a.r(t),a.d(t,{default:()=>A});var n=a(67294),r=a(10833),l=a(53438),i=a(44996),s=a(86010),c=a(39960),o=a(13919),m=a(95999);const d="cardContainer_fWXF",u="cardTitle_rnsV",b="cardDescription_PWke";function h(e){let{href:t,children:a}=e;return n.createElement(c.default,{href:t,className:(0,s.default)("card padding--lg",d)},a)}function p(e){let{href:t,icon:a,title:r,description:l}=e;return n.createElement(h,{href:t},n.createElement("h2",{className:(0,s.default)("text--truncate",u),title:r},a," ",r),l&&n.createElement("p",{className:(0,s.default)("text--truncate",b),title:l},l))}function f(e){let{item:t}=e;const a=(0,l.Wl)(t);return a?n.createElement(p,{href:a,icon:"\ud83d\uddc3\ufe0f",title:t.label,description:t.description??(0,m.I)({message:"{count} items",id:"theme.docs.DocCard.categoryDescription",description:"The default description for a category card in the generated index about how many items this category includes"},{count:t.items.length})}):null}function v(e){let{item:t}=e;const a=(0,o.Z)(t.href)?"\ud83d\udcc4\ufe0f":"\ud83d\udd17",r=(0,l.xz)(t.docId??void 0);return n.createElement(p,{href:t.href,icon:a,title:t.label,description:t.description??r?.description})}function E(e){let{item:t}=e;switch(t.type){case"link":return n.createElement(v,{item:t});case"category":return n.createElement(f,{item:t});default:throw new Error(`unknown item type ${JSON.stringify(t)}`)}}function g(e){let{className:t}=e;const a=(0,l.jA)();return n.createElement(N,{items:a.items,className:t})}function N(e){const{items:t,className:a}=e;if(!t)return n.createElement(g,e);const r=(0,l.MN)(t);return 
n.createElement("section",{className:(0,s.default)("row",a)},r.map(((e,t)=>n.createElement("article",{key:t,className:"col col--6 margin-bottom--lg"},n.createElement(E,{item:e})))))}var _=a(80049),k=a(23120),L=a(44364),T=a(1310),Z=a(92503);const x="generatedIndexPage_vN6x",y="list_eTzJ",I="title_kItE";function w(e){let{categoryGeneratedIndex:t}=e;return n.createElement(r.d,{title:t.title,description:t.description,keywords:t.keywords,image:(0,i.Z)(t.image)})}function V(e){let{categoryGeneratedIndex:t}=e;const a=(0,l.jA)();return n.createElement("div",{className:x},n.createElement(k.default,null),n.createElement(T.default,null),n.createElement(L.default,null),n.createElement("header",null,n.createElement(Z.Z,{as:"h1",className:I},t.title),t.description&&n.createElement("p",null,t.description)),n.createElement("article",{className:"margin-top--lg"},n.createElement(N,{items:a.items,className:y})),n.createElement("footer",{className:"margin-top--lg"},n.createElement(_.Z,{previous:t.navigation.previous,next:t.navigation.next})))}function A(e){return n.createElement(n.Fragment,null,n.createElement(w,e),n.createElement(V,e))}},80049:(e,t,a)=>{a.d(t,{Z:()=>s});var n=a(87462),r=a(67294),l=a(95999),i=a(32244);function s(e){const{previous:t,next:a}=e;return r.createElement("nav",{className:"pagination-nav docusaurus-mt-lg","aria-label":(0,l.I)({id:"theme.docs.paginator.navAriaLabel",message:"Docs pages",description:"The ARIA label for the docs pagination"})},t&&r.createElement(i.Z,(0,n.Z)({},t,{subLabel:r.createElement(l.Z,{id:"theme.docs.paginator.previous",description:"The label used to navigate to the previous doc"},"Previous")})),a&&r.createElement(i.Z,(0,n.Z)({},a,{subLabel:r.createElement(l.Z,{id:"theme.docs.paginator.next",description:"The label used to navigate to the next doc"},"Next"),isNext:!0})))}},44364:(e,t,a)=>{a.r(t),a.d(t,{default:()=>c});var n=a(67294),r=a(86010),l=a(95999),i=a(35281),s=a(74477);function c(e){let{className:t}=e;const a=(0,s.E)();return 
a.badge?n.createElement("span",{className:(0,r.default)(t,i.k.docs.docVersionBadge,"badge badge--secondary")},n.createElement(l.Z,{id:"theme.docs.versionBadge.label",values:{versionLabel:a.label}},"Version: {versionLabel}")):null}},23120:(e,t,a)=>{a.r(t),a.d(t,{default:()=>f});var n=a(67294),r=a(86010),l=a(52263),i=a(39960),s=a(95999),c=a(80143),o=a(35281),m=a(60373),d=a(74477);const u={unreleased:function(e){let{siteTitle:t,versionMetadata:a}=e;return n.createElement(s.Z,{id:"theme.docs.versions.unreleasedVersionLabel",description:"The label used to tell the user that he's browsing an unreleased doc version",values:{siteTitle:t,versionLabel:n.createElement("b",null,a.label)}},"This is unreleased documentation for {siteTitle} {versionLabel} version.")},unmaintained:function(e){let{siteTitle:t,versionMetadata:a}=e;return n.createElement(s.Z,{id:"theme.docs.versions.unmaintainedVersionLabel",description:"The label used to tell the user that he's browsing an unmaintained doc version",values:{siteTitle:t,versionLabel:n.createElement("b",null,a.label)}},"This is documentation for {siteTitle} {versionLabel}, which is no longer actively maintained.")}};function b(e){const t=u[e.versionMetadata.banner];return n.createElement(t,e)}function h(e){let{versionLabel:t,to:a,onClick:r}=e;return n.createElement(s.Z,{id:"theme.docs.versions.latestVersionSuggestionLabel",description:"The label used to tell the user to check the latest version",values:{versionLabel:t,latestVersionLink:n.createElement("b",null,n.createElement(i.default,{to:a,onClick:r},n.createElement(s.Z,{id:"theme.docs.versions.latestVersionLinkLabel",description:"The label used for the latest version suggestion link label"},"latest version")))}},"For up-to-date documentation, see the {latestVersionLink} ({versionLabel}).")}function 
p(e){let{className:t,versionMetadata:a}=e;const{siteConfig:{title:i}}=(0,l.default)(),{pluginId:s}=(0,c.gA)({failfast:!0}),{savePreferredVersionName:d}=(0,m.J)(s),{latestDocSuggestion:u,latestVersionSuggestion:p}=(0,c.Jo)(s),f=u??(v=p).docs.find((e=>e.id===v.mainDocId));var v;return n.createElement("div",{className:(0,r.default)(t,o.k.docs.docVersionBanner,"alert alert--warning margin-bottom--md"),role:"alert"},n.createElement("div",null,n.createElement(b,{siteTitle:i,versionMetadata:a})),n.createElement("div",{className:"margin-top--md"},n.createElement(h,{versionLabel:p.label,to:f.path,onClick:()=>d(p.name)})))}function f(e){let{className:t}=e;const a=(0,d.E)();return a.banner?n.createElement(p,{className:t,versionMetadata:a}):null}},32244:(e,t,a)=>{a.d(t,{Z:()=>i});var n=a(67294),r=a(86010),l=a(39960);function i(e){const{permalink:t,title:a,subLabel:i,isNext:s}=e;return n.createElement(l.default,{className:(0,r.default)("pagination-nav__link",s?"pagination-nav__link--next":"pagination-nav__link--prev"),to:t},i&&n.createElement("div",{className:"pagination-nav__sublabel"},i),n.createElement("div",{className:"pagination-nav__label"},a))}}}]);
"use strict"; (self.webpackChunkdocusaurus_template_openapi_docs = self.webpackChunkdocusaurus_template_openapi_docs || []).push([[6303], { 70192: e => { e.exports = JSON.parse('{"title":"Public Models API","description":"5 minutes to learn the most important Docusaurus concepts.","slug":"/category/public-models-api","permalink":"/category/public-models-api","navigation":{"previous":{"title":"Using AON with HTTP","permalink":"/quick-start/http"},"next":{"title":"Prediction","permalink":"/publicModelsAPI/all-mpnet-base-v2/predictions-post"}}}') } }]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[3568],{83769:e=>{e.exports=JSON.parse('{"name":"docusaurus-plugin-content-docs","id":"default"}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[453],{98605:e=>{e.exports=JSON.parse('{"label":"hello","permalink":"/blog/tags/hello","allTagsPath":"/blog/tags","count":2}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[7414],{53123:(e,t,a)=>{a.r(t),a.d(t,{contentTitle:()=>r,default:()=>m,frontMatter:()=>n,metadata:()=>d,toc:()=>s});var o=a(87462),p=(a(67294),a(3905));const n={title:"Markdown page example"},r="Markdown page example",d={type:"mdx",permalink:"/markdown-page",source:"@site/src/pages/markdown-page.md",title:"Markdown page example",description:"You don't need React to write simple standalone pages.",frontMatter:{title:"Markdown page example"}},s=[],l={toc:s};function m(e){let{components:t,...a}=e;return(0,p.kt)("wrapper",(0,o.Z)({},l,a,{components:t,mdxType:"MDXLayout"}),(0,p.kt)("h1",{id:"markdown-page-example"},"Markdown page example"),(0,p.kt)("p",null,"You don't need React to write simple standalone pages."))}m.isMDXComponent=!0}}]);
"use strict"; (self.webpackChunkdocusaurus_template_openapi_docs = self.webpackChunkdocusaurus_template_openapi_docs || []).push([[9333], { 52855: (t, e, a) => { a.r(e), a.d(e, { assets: () => d, contentTitle: () => r, default: () => k, frontMatter: () => i, metadata: () => s, toc: () => u }); var n = a(87462), o = (a(67294), a(3905)); const i = { id: "get_started", sidebar_position: 0 }, r = "Get Started", s = { unversionedId: "quick-start/get_started", id: "quick-start/get_started", title: "Get Started", description: "AON makes it easy to run machine learning models in the cloud from your own code.", source: "@site/docs/quick-start/get_started.md", sourceDirName: "quick-start", slug: "/quick-start/get_started", permalink: "/quick-start/get_started", draft: !1, tags: [], version: "current", sidebarPosition: 0, frontMatter: { id: "get_started", sidebar_position: 0 }, sidebar: "tutorialSidebar", previous: { title: "Quick Start", permalink: "/category/quick-start" }, next: { title: "Using AON with Node.js", permalink: "/quick-start/javascript_sdk" } }, d = {}, u = [{ value: "Get API Key And JWT Token", id: "get-api-key-and-jwt-token", level: 2 }, { value: "Usage", id: "usage", level: 2 }], l = { toc: u }; function k(t) { let { components: e, ...a } = t; return (0, o.kt)("wrapper", (0, n.Z)({}, l, a, { components: e, mdxType: "MDXLayout" }), (0, o.kt)("h1", { id: "get-started" }, "Get Started"), (0, o.kt)("p", null, "AON makes it easy to run machine learning models in the cloud from your own code."), (0, o.kt)("h2", { id: "get-api-key-and-jwt-token" }, "Get API Key And JWT Token"), (0, o.kt)("p", null, "Before using AON, we need to obtain an API Key or JWT Token for authentication in future API calls."), (0, o.kt)("p", null, "1\uff0cLogin from ", (0, o.kt)("a", { parentName: "p", href: "https://aigic.ai/" }, "AON home site"), " or ", (0, o.kt)("a", { parentName: "p", href: "https://console.aigic.ai/" }, "Dashboard")), (0, o.kt)("p", null, "2, If you are logged in from 
the AON home site, please navigate to the Dashboard."), (0, o.kt)("p", null, "3\uff0cYou can find the API Key from the list of API Keys."), (0, o.kt)("p", null, "4\uff0cYou can find the JWT Token from the list of JWT Tokens."), (0, o.kt)("p", null, "5\uff0cYou can choose either API key or JWT token for authentication."), (0, o.kt)("h2", { id: "usage" }, "Usage"), (0, o.kt)("p", null, (0, o.kt)("a", { parentName: "p", href: "javascript_sdk" }, "Using AON with Node.js")), (0, o.kt)("p", null, (0, o.kt)("a", { parentName: "p", href: "python_sdk" }, "Using AON with Python")), (0, o.kt)("p", null, (0, o.kt)("a", { parentName: "p", href: "http" }, "Using AON with HTTP"))) } k.isMDXComponent = !0 } }]);
/*!
Copyright (c) 2015 Jed Watson.
Based on code that is Copyright 2013-2015, Facebook, Inc.
All rights reserved.
*/
/*!
* Adapted from jQuery UI core
*
* http://jqueryui.com
*
* Copyright 2014 jQuery Foundation and other contributors
* Released under the MIT license.
* http://jquery.org/license
*
* http://api.jqueryui.com/category/ui-core/
*/
/*!
* The buffer module from node.js, for the browser.
*
* @author Feross Aboukhadijeh <https://feross.org>
* @license MIT
*/
/*!
* mime-db
* Copyright(c) 2014 Jonathan Ong
* MIT Licensed
*/
/*!
* mime-types
* Copyright(c) 2014 Jonathan Ong
* Copyright(c) 2015 Douglas Christopher Wilson
* MIT Licensed
*/
/*! Bundled license information:
prismjs/prism.js:
(**
* Prism: Lightweight, robust, elegant syntax highlighting
*
* @license MIT <https://opensource.org/licenses/MIT>
* @author Lea Verou <https://lea.verou.me>
* @namespace
* @public
*)
*/
/*! https://mths.be/punycode v1.3.2 by @mathias */
/*! ieee754. BSD-3-Clause License. Feross Aboukhadijeh <https://feross.org/opensource> */
/*! safe-buffer. MIT License. Feross Aboukhadijeh <https://feross.org/opensource> */
/**
* @license
* Lodash <https://lodash.com/>
* Copyright OpenJS Foundation and other contributors <https://openjsf.org/>
* Released under MIT license <https://lodash.com/license>
* Based on Underscore.js 1.8.3 <http://underscorejs.org/LICENSE>
* Copyright Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors
*/
/**
* @license React
* react-jsx-runtime.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @preserve
* Counter block mode compatible with Dr Brian Gladman fileenc.c
* derived from CryptoJS.mode.CTR
* Jan Hruby jhruby.web@gmail.com
*/
/** @preserve
(c) 2012 by Cédric Mesnil. All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/**!
* @license http://www.apache.org/licenses/LICENSE-2.0
*
* Copyright 2015 Postdot Technologies Pvt. Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
* an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and limitations under the License.
*/
/**!
* Originally written by:
* https://github.com/sindresorhus/parse-json
*/
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[4972],{4972:(e,t,a)=>{a.r(t),a.d(t,{default:()=>c});var n=a(67294),o=a(95999),l=a(10833),r=a(95677);function c(){return n.createElement(n.Fragment,null,n.createElement(l.d,{title:(0,o.I)({id:"theme.NotFound.title",message:"Page Not Found"})}),n.createElement(r.Z,null,n.createElement("main",{className:"container margin-vert--xl"},n.createElement("div",{className:"row"},n.createElement("div",{className:"col col--6 col--offset-3"},n.createElement("h1",{className:"hero__title"},n.createElement(o.Z,{id:"theme.NotFound.title",description:"The title of the 404 page"},"Page Not Found")),n.createElement("p",null,n.createElement(o.Z,{id:"theme.NotFound.p1",description:"The first paragraph of the 404 page"},"We could not find what you were looking for.")),n.createElement("p",null,n.createElement(o.Z,{id:"theme.NotFound.p2",description:"The 2nd paragraph of the 404 page"},"Please contact the owner of the site that linked you to the original URL and let them know their link is broken.")))))))}}}]);
(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[1893],{69604:()=>{},91998:()=>{},44616:()=>{},42480:()=>{},69862:()=>{},40964:()=>{}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[9035],{50499:e=>{e.exports=JSON.parse('{"permalink":"/blog/tags/hola","page":1,"postsPerPage":10,"totalPages":1,"totalCount":1,"blogDescription":"Blog","blogTitle":"Blog"}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[2267],{88642:(t,e,o)=>{o.r(e),o.d(e,{assets:()=>n,contentTitle:()=>u,default:()=>i,frontMatter:()=>r,metadata:()=>l,toc:()=>p});var s=o(87462),a=(o(67294),o(3905));const r={slug:"mdx-blog-post",title:"MDX Blog Post",authors:["slorber"],tags:["docusaurus"]},u=void 0,l={permalink:"/blog/mdx-blog-post",editUrl:"https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/blog/2021-08-01-mdx-blog-post.mdx",source:"@site/blog/2021-08-01-mdx-blog-post.mdx",title:"MDX Blog Post",description:"Blog posts support Docusaurus Markdown features, such as MDX.",date:"2021-08-01T00:00:00.000Z",formattedDate:"August 1, 2021",tags:[{label:"docusaurus",permalink:"/blog/tags/docusaurus"}],readingTime:.175,hasTruncateMarker:!1,authors:[{name:"S\xe9bastien Lorber",title:"Docusaurus maintainer",url:"https://sebastienlorber.com",imageURL:"https://github.com/slorber.png",key:"slorber"}],frontMatter:{slug:"mdx-blog-post",title:"MDX Blog Post",authors:["slorber"],tags:["docusaurus"]},prevItem:{title:"Welcome",permalink:"/blog/welcome"},nextItem:{title:"Long Blog Post",permalink:"/blog/long-blog-post"}},n={authorsImageUrls:[void 0]},p=[],c={toc:p};function i(t){let{components:e,...o}=t;return(0,a.kt)("wrapper",(0,s.Z)({},c,o,{components:e,mdxType:"MDXLayout"}),(0,a.kt)("p",null,"Blog posts support ",(0,a.kt)("a",{parentName:"p",href:"https://docusaurus.io/docs/markdown-features"},"Docusaurus Markdown features"),", such as ",(0,a.kt)("a",{parentName:"p",href:"https://mdxjs.com/"},"MDX"),"."),(0,a.kt)("admonition",{type:"tip"},(0,a.kt)("p",{parentName:"admonition"},"Use the power of React to create interactive blog posts."),(0,a.kt)("pre",{parentName:"admonition"},(0,a.kt)("code",{parentName:"pre",className:"language-js"},"<button onClick={() => alert('button clicked!')}>Click me!</button>\n")),(0,a.kt)("button",{onClick:()=>alert("button 
clicked!")},"Click me!")))}i.isMDXComponent=!0}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[6938],{84545:e=>{e.exports=JSON.parse('{"permalink":"/blog/tags/docusaurus","page":1,"postsPerPage":10,"totalPages":1,"totalCount":4,"blogDescription":"Blog","blogTitle":"Blog"}')}}]);
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[110],{70711:e=>{e.exports=JSON.parse('{"permalink":"/blog/tags/hello","page":1,"postsPerPage":10,"totalPages":1,"totalCount":2,"blogDescription":"Blog","blogTitle":"Blog"}')}}]);
"use strict"; (self.webpackChunkdocusaurus_template_openapi_docs = self.webpackChunkdocusaurus_template_openapi_docs || []).push([[4043], { 15863: (t, e, a) => { a.r(e), a.d(e, { assets: () => l, contentTitle: () => s, default: () => c, frontMatter: () => r, metadata: () => o, toc: () => p }); var n = a(87462), i = (a(67294), a(3905)); const r = { id: "javascript_sdk", sidebar_position: 1 }, s = "Using AON with Node.js", o = { unversionedId: "quick-start/javascript_sdk", id: "quick-start/javascript_sdk", title: "Using AON with Node.js", description: "AGIN - JavaScript sdk for AON", source: "@site/docs/quick-start/javascript_sdk.md", sourceDirName: "quick-start", slug: "/quick-start/javascript_sdk", permalink: "/quick-start/javascript_sdk", draft: !1, tags: [], version: "current", sidebarPosition: 1, frontMatter: { id: "javascript_sdk", sidebar_position: 1 }, sidebar: "tutorialSidebar", previous: { title: "Get Started", permalink: "/quick-start/get_started" }, next: { title: "Using AON with Python", permalink: "/quick-start/python_sdk" } }, l = {}, p = [{ value: "Installation", id: "installation", level: 2 }, { value: "For Node.js", id: "for-nodejs", level: 3 }, { value: "npm", id: "npm", level: 4 }, { value: "Getting Started", id: "getting-started", level: 2 }, { value: "API", id: "api", level: 2 }, { value: "Constructor", id: "constructor", level: 3 }, { value: "Docs", id: "docs", level: 2 }], d = { toc: p }; function c(t) { let { components: e, ...a } = t; return (0, i.kt)("wrapper", (0, n.Z)({}, d, a, { components: e, mdxType: "MDXLayout" }), (0, i.kt)("h1", { id: "using-aigic-with-nodejs" }, "Using AON with Node.js"), (0, i.kt)("p", null, "AGIN - JavaScript sdk for AON\nA Node.js client for AON. It lets you run models from your Node.js code, and everything else you can do with HTTP API."), (0, i.kt)("p", null, (0, i.kt)("strong", { parentName: "p" }, "[!IMPORTANT]"), " This library can't interact with AGIN API directly from a browser. 
"), (0, i.kt)("ul", null, (0, i.kt)("li", { parentName: "ul" }, "Node.js version: 16.20.0"), (0, i.kt)("li", { parentName: "ul" }, "API version: 1.0.0"), (0, i.kt)("li", { parentName: "ul" }, "Package version: 1.0.0")), (0, i.kt)("h2", { id: "installation" }, "Installation"), (0, i.kt)("h3", { id: "for-nodejs" }, "For ", (0, i.kt)("a", { parentName: "h3", href: "https://nodejs.org/" }, "Node.js")), (0, i.kt)("h4", { id: "npm" }, "npm"), (0, i.kt)("p", null, "Install it from npm:"), (0, i.kt)("pre", null, (0, i.kt)("code", { parentName: "pre", className: "language-shell" }, "npm install agin --save\n")), (0, i.kt)("h2", { id: "getting-started" }, "Getting Started"), (0, i.kt)("p", null, "Please follow the ", (0, i.kt)("a", { parentName: "p", href: "#installation" }, "installation"), " instruction and execute the following JS code:"), (0, i.kt)("pre", null, (0, i.kt)("code", { parentName: "pre", className: "language-javascript" }, 'const AGIN = require("agin");\n\nconst options = {\n //get your api key or jwt token from https://console.aigic.ai\n auth:"my api key or jwt token"\n}\nconst agin = new AGIN(options);\nlet response = await agin.prediction("/predictions/ai/ssd-1b",\n {\n input:{\n "seed": 36446545872,\n "width": 768,\n "height": 768,\n "prompt": "with smoke, half ice and half fire and ultra realistic in detail.wolf, typography, dark fantasy, wildlife photography, vibrant, cinematic and on a black background",\n "scheduler": "K_EULER",\n "num_outputs": 1,\n "guidance_scale": 9,\n "negative_prompt": "scary, cartoon, painting",\n "num_inference_steps": 25\n }\n });\n console.log("test",response);\n\n')), (0, i.kt)("h2", { id: "api" }, "API"), (0, i.kt)("p", null, "All URIs are default relative to ", (0, i.kt)("em", { parentName: "p" }, (0, i.kt)("a", { parentName: "em", href: "https://api.aigic.ai" }, "https://api.aigic.ai"))), (0, i.kt)("h3", { id: "constructor" }, "Constructor"), (0, i.kt)("pre", null, (0, i.kt)("code", { parentName: "pre", className: 
"language-javascript" }, "const agin = new AGIN(options);\n")), (0, i.kt)("table", null, (0, i.kt)("thead", { parentName: "table" }, (0, i.kt)("tr", { parentName: "thead" }, (0, i.kt)("th", { parentName: "tr", align: null }, "name"), (0, i.kt)("th", { parentName: "tr", align: null }, "type"), (0, i.kt)("th", { parentName: "tr", align: null }, "Description"))), (0, i.kt)("tbody", { parentName: "table" }, (0, i.kt)("tr", { parentName: "tbody" }, (0, i.kt)("td", { parentName: "tr", align: null }, (0, i.kt)("em", { parentName: "td" }, "host")), (0, i.kt)("td", { parentName: "tr", align: null }, "string"), (0, i.kt)("td", { parentName: "tr", align: null }, "The request host")), (0, i.kt)("tr", { parentName: "tbody" }, (0, i.kt)("td", { parentName: "tr", align: null }, (0, i.kt)("em", { parentName: "td" }, "auth")), (0, i.kt)("td", { parentName: "tr", align: null }, "string"), (0, i.kt)("td", { parentName: "tr", align: null }, (0, i.kt)("strong", { parentName: "td" }, "Required"), ".You can choose either API key or JWT token for authentication.")))), (0, i.kt)("h2", { id: "docs" }, "Docs"), (0, i.kt)("p", null, "For more information, please visit the site ", (0, i.kt)("em", { parentName: "p" }, (0, i.kt)("a", { parentName: "em", href: "https://docs.aigic.ai" }, "https://docs.aigic.ai")))) } c.isMDXComponent = !0 } }]);
\ No newline at end of file
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[9642],{36911:(e,t,o)=>{o.r(t),o.d(t,{assets:()=>r,contentTitle:()=>n,default:()=>c,frontMatter:()=>s,metadata:()=>u,toc:()=>i});var a=o(87462),l=(o(67294),o(3905));const s={slug:"welcome",title:"Welcome",authors:["slorber","yangshun"],tags:["facebook","hello","docusaurus"]},n=void 0,u={permalink:"/blog/welcome",editUrl:"https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/blog/2021-08-26-welcome/index.md",source:"@site/blog/2021-08-26-welcome/index.md",title:"Welcome",description:"Docusaurus blogging features are powered by the blog plugin.",date:"2021-08-26T00:00:00.000Z",formattedDate:"August 26, 2021",tags:[{label:"facebook",permalink:"/blog/tags/facebook"},{label:"hello",permalink:"/blog/tags/hello"},{label:"docusaurus",permalink:"/blog/tags/docusaurus"}],readingTime:.405,hasTruncateMarker:!1,authors:[{name:"S\xe9bastien Lorber",title:"Docusaurus maintainer",url:"https://sebastienlorber.com",imageURL:"https://github.com/slorber.png",key:"slorber"},{name:"Yangshun Tay",title:"Front End Engineer @ Facebook",url:"https://github.com/yangshun",imageURL:"https://github.com/yangshun.png",key:"yangshun"}],frontMatter:{slug:"welcome",title:"Welcome",authors:["slorber","yangshun"],tags:["facebook","hello","docusaurus"]},nextItem:{title:"MDX Blog Post",permalink:"/blog/mdx-blog-post"}},r={authorsImageUrls:[void 0,void 0]},i=[],p={toc:i};function c(e){let{components:t,...s}=e;return(0,l.kt)("wrapper",(0,a.Z)({},p,s,{components:t,mdxType:"MDXLayout"}),(0,l.kt)("p",null,(0,l.kt)("a",{parentName:"p",href:"https://docusaurus.io/docs/blog"},"Docusaurus blogging features")," are powered by the ",(0,l.kt)("a",{parentName:"p",href:"https://docusaurus.io/docs/api/plugins/@docusaurus/plugin-content-blog"},"blog plugin"),"."),(0,l.kt)("p",null,"Simply add Markdown files (or folders) to the 
",(0,l.kt)("inlineCode",{parentName:"p"},"blog")," directory."),(0,l.kt)("p",null,"Regular blog authors can be added to ",(0,l.kt)("inlineCode",{parentName:"p"},"authors.yml"),"."),(0,l.kt)("p",null,"The blog post date can be extracted from filenames, such as:"),(0,l.kt)("ul",null,(0,l.kt)("li",{parentName:"ul"},(0,l.kt)("inlineCode",{parentName:"li"},"2019-05-30-welcome.md")),(0,l.kt)("li",{parentName:"ul"},(0,l.kt)("inlineCode",{parentName:"li"},"2019-05-30-welcome/index.md"))),(0,l.kt)("p",null,"A blog post folder can be convenient to co-locate blog post images:"),(0,l.kt)("p",null,(0,l.kt)("img",{alt:"Docusaurus Plushie",src:o(1102).Z,width:"1500",height:"500"})),(0,l.kt)("p",null,"The blog supports tags as well!"),(0,l.kt)("p",null,(0,l.kt)("strong",{parentName:"p"},"And if you don't want a blog"),": just delete this directory, and use ",(0,l.kt)("inlineCode",{parentName:"p"},"blog: false")," in your Docusaurus config."))}c.isMDXComponent=!0},1102:(e,t,o)=>{o.d(t,{Z:()=>a});const a=o.p+"assets/images/docusaurus-plushie-banner-a60f7593abca1e3eef26a9afa244e4fb.jpeg"}}]);
\ No newline at end of file
"use strict";(self.webpackChunkdocusaurus_template_openapi_docs=self.webpackChunkdocusaurus_template_openapi_docs||[]).push([[2535],{45641:e=>{e.exports=JSON.parse('{"title":"Recent posts","items":[{"title":"Welcome","permalink":"/blog/welcome"},{"title":"MDX Blog Post","permalink":"/blog/mdx-blog-post"},{"title":"Long Blog Post","permalink":"/blog/long-blog-post"},{"title":"First Blog Post","permalink":"/blog/first-blog-post"}]}')}}]);
\ No newline at end of file