infernet-container-starter

Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate the power of Infernet and the wide range of applications that can be built with it:

Examples

  1. Hello World: Infernet's version of a hello-world program. Here, we deploy a container that simply echoes the input back to us.
  2. Running a Torch Model on Infernet: This example shows you how to deploy a pre-trained PyTorch model to Infernet, making it easier for you to deploy your own models.
  3. Running an ONNX Model on Infernet: Same as the previous example, but this time we deploy an ONNX model to Infernet.
  4. Prompt to NFT: In this example, we use Stable Diffusion to mint NFTs on-chain from a text prompt.
  5. TGI Inference with Mistral-7b: This example shows you how to serve an arbitrary LLM using Hugging Face's TGI (Text Generation Inference) and use it with an Infernet node.
  6. Running OpenAI's GPT-4 on Infernet: This example shows you how to call OpenAI's GPT-4 model from an Infernet workflow.
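To give a feel for what the Hello World example does, here is a minimal, hypothetical sketch of an "echo" service: an HTTP container that returns whatever JSON payload it receives. This is illustrative only — the endpoint, port, and response shape are assumptions, not the actual code from the `projects` directory.

```python
# Hypothetical echo service, illustrating the Hello World container's behavior:
# it reads a JSON request body and echoes it back in the response.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body (empty body becomes an empty JSON object).
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")

        # Echo the payload back, wrapped in an "output" key (assumed shape).
        body = json.dumps({"output": payload}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet


# Inside a container you would typically run something like:
#   HTTPServer(("0.0.0.0", 3000), EchoHandler).serve_forever()
```

Each example's README walks through packaging such a service as a Docker container and registering it with an Infernet node.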