# infernet-container-starter

Welcome to this repository! 🎉 This repo contains a series of examples that demonstrate the power of Infernet and the wide range of applications that can be built with it:

## Examples

  1. Hello World: Infernet's version of a hello-world program. Here, we deploy a container that simply echoes the input back to us.
  2. Running a Torch Model on Infernet: This example shows you how to deploy a pre-trained PyTorch model to Infernet, making it easier to deploy your own models.
  3. Running an ONNX Model on Infernet: Same as the previous example, but this time we deploy an ONNX model to Infernet.
  4. Prompt to NFT: In this example, we use Stable Diffusion to mint NFTs on-chain from a text prompt.
  5. TGI Inference with Mistral-7b: This example shows you how to deploy an arbitrary LLM using Hugging Face's TGI and use it with an Infernet node.
  6. Running OpenAI's GPT-4 on Infernet: This example shows you how to run OpenAI's GPT-4 model on Infernet.
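
The Hello World example above boils down to a container exposing an HTTP endpoint that returns whatever input it receives. The sketch below is a minimal, hypothetical illustration of that idea using only the Python standard library; the endpoint, port, and payload shape are assumptions for demonstration, not the actual Infernet container API.

```python
# Hypothetical sketch of an "echo" service in the spirit of the Hello World
# example. The payload shape and port are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def echo(payload: dict) -> dict:
    """Return the request payload back to the caller unchanged."""
    return {"echo": payload}


class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body and echo it back.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(echo(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


# To run the service inside a container, you would start the server, e.g.:
# HTTPServer(("0.0.0.0", 3000), EchoHandler).serve_forever()
```

In the real example, the container is built and registered with an Infernet node, which forwards job inputs to the container and returns its output; see the Hello World walkthrough for the actual setup.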