# infernet-container-starter
Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate the power of Infernet and the wide range of applications that can be built with it:
## Examples
- Hello World: Infernet's version of a `hello-world` program. Here, we deploy a container that simply echoes the input back to us.
- Running a Torch Model on Infernet: This example shows you how to deploy a pre-trained PyTorch model to Infernet. Following it will make it easier to deploy your own models.
- Running an ONNX Model on Infernet: Same as the previous example, but this time we deploy an ONNX model.
- Prompt to NFT: In this example, we use Stable Diffusion to mint NFTs on-chain from a text prompt.
- TGI Inference with Mistral-7b: This example shows you how to serve an arbitrary LLM using Hugging Face's TGI and use it with an Infernet node.
- Running OpenAI's GPT-4 on Infernet: This example shows you how to use OpenAI's GPT-4 model with Infernet.
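To give a feel for what the simplest of these containers does, here is a minimal sketch of an echo service like the one in the Hello World example. This is an illustration only: the endpoint path, port, and payload shape are assumptions for the sketch, not Infernet's actual container API (see the Hello World example itself for the real interface).

```python
# Minimal echo service sketch (hypothetical; not Infernet's actual container API).
# It accepts a JSON POST body and echoes it back wrapped in an "output" field.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the raw request body, defaulting to an empty JSON object.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) or b"{}"
        payload = json.loads(body)

        # Echo the input back, wrapped in an "output" field (assumed shape).
        response = json.dumps({"output": payload}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(response)))
        self.end_headers()
        self.wfile.write(response)

    def log_message(self, *args):
        # Silence per-request logging for cleaner output.
        pass


def run(port=3000):
    """Serve the echo handler forever on the given port (port is an assumption)."""
    HTTPServer(("0.0.0.0", port), EchoHandler).serve_forever()
```

Calling `run()` starts the server; a `POST` with body `{"msg": "hi"}` would come back as `{"output": {"msg": "hi"}}`.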