# infernet-container-starter
Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate
the power of Infernet and the wide range of applications that can be built with it:
## Examples
1. [Hello World](projects/hello-world/hello-world.md): Infernet's version of a `hello-world` program. Here, we deploy
a container that simply echoes the input back to us (see the sketch after this list).
2. [Running a Torch Model on Infernet](projects/torch-iris/torch-iris.md): This example shows you how to deploy a pre-trained [PyTorch](https://pytorch.org/)
model to Infernet, and serves as a starting point for deploying your own models.
3. [Running an ONNX Model on Infernet](projects/onnx-iris/onnx-iris.md): Same as the previous example, but this time we deploy
an ONNX model to Infernet.
4. [Prompt to NFT](projects/prompt-to-nft/prompt-to-nft.md): In this example, we use [Stable Diffusion](https://github.com/Stability-AI/stablediffusion) to
mint NFTs on-chain from a text prompt.
5. [TGI Inference with Mistral-7b](projects/tgi-llm/tgi-llm.md): This example shows you how to deploy an arbitrary
LLM using [Hugging Face's TGI](https://huggingface.co/docs/text-generation-inference/en/index) and use it with an Infernet node.
6. [Running OpenAI's GPT-4 on Infernet](projects/gpt4/gpt4.md): This example shows you how to deploy OpenAI's GPT-4 model
to Infernet.
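
To give a feel for what the hello-world example does, here is a minimal sketch of an echo-style container service. It is illustrative only: the actual hello-world project defines its own service interface, so the endpoint, port, and payload shape below are assumptions rather than Infernet's API.

```python
# Minimal sketch of an "echo" container service (illustrative only).
# The endpoint, port, and payload shape are assumptions, not Infernet's API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body and echo it straight back.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        payload = json.loads(body) if body else {}
        response = json.dumps({"echo": payload}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(response)


if __name__ == "__main__":
    # Serve on port 3000 inside the container (the port is an assumption).
    HTTPServer(("0.0.0.0", 3000), EchoHandler).serve_forever()
```

With a service like this running, a request such as `curl -X POST localhost:3000 -d '{"message": "hi"}'` (hypothetical invocation) would simply return the same payload, which is all the hello-world container needs to demonstrate the request/response flow.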