
# infernet-container-starter

Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate the power of Infernet and the wide range of applications that can be built with it:

## Examples

  1. Hello World: Infernet's version of a hello-world program. Here, we deploy a container that simply echoes the input back to us.
  2. Running a Torch Model on Infernet: This example shows you how to deploy a pre-trained PyTorch model to Infernet, making it easier for you to deploy your own models.
  3. Running an ONNX Model on Infernet: Same as the previous example, but this time we deploy an ONNX model to Infernet.
  4. Prompt to NFT: In this example, we use Stable Diffusion to mint NFTs on-chain from a text prompt.
  5. TGI Inference with Mistral-7b: This example shows you how to deploy an arbitrary LLM using Hugging Face's TGI and use it with an Infernet node.
  6. Running OpenAI's GPT-4 on Infernet: This example shows you how to use OpenAI's GPT-4 model with an Infernet node.
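
To give a feel for what the Hello World example does, here is a minimal sketch of an echo service: an HTTP handler that returns whatever JSON payload it receives. This is an illustration only, assuming a plain POST-with-JSON interface; the endpoint path, payload shape, and response format are not the actual Infernet container API.

```python
# Hypothetical echo service sketch (stdlib only) — mirrors the Hello World
# container's behavior of echoing the input back to the caller.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body and echo it back as the job "output".
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        response = json.dumps({"output": body}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(response)))
        self.end_headers()
        self.wfile.write(response)

    def log_message(self, *args):
        # Silence per-request logging for a cleaner demo.
        pass


# Quick local check: serve on an ephemeral port and post a payload.
server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/",
    data=json.dumps({"message": "hello, infernet"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())
print(result)
server.shutdown()
```

Running this prints the echoed payload wrapped in an `output` field, which is the essence of what the Hello World container demonstrates before moving on to real model-serving examples.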