
KitOps 1.0 Launch: Production-Ready and Eyeing CNCF Integration

KitOps has officially launched version 1.0, marking a significant milestone for the project, which was initiated by Jozu to address the complexities associated with AI/ML packaging and versioning.

Designed as an open-source solution, KitOps aims to deliver speed, security, and consistency across various computing environments.

The project has been submitted to the Cloud Native Computing Foundation (CNCF) Sandbox, signaling its intent to deepen its integration within the cloud-native ecosystem.

At the core of KitOps is the ModelKit, a standardized OCI artifact that facilitates the management, security, and auditing of AI/ML projects.

This tool is designed to enhance collaboration among data scientists, application developers, and site reliability engineers (SREs).

To support these diverse user groups, KitOps offers a command-line interface (CLI) called ‘kit,’ which can be utilized on desktops, integrated into CI/CD pipelines, and embedded within other tools. Additionally, KitOps provides a GitHub Action and a Dagger module to streamline AI/ML workflows.

For data scientists, KitOps introduces the PyKitOps library and an MLflow plugin, enabling seamless sharing of ModelKits across OCI registries. These tools simplify versioning and deploying machine learning models, making it easier for teams to collaborate and maintain consistency across environments.

Since its inception, KitOps has seen over 45,000 installations and is actively used in production by private enterprises with global operations, as well as by security-focused public sector development teams. This widespread adoption underscores KitOps’ reliability and effectiveness in real-world scenarios.

The 1.0 release brings several new features and improvements. Key enhancements include better integration capabilities, improved performance, and enhanced security measures. These updates are designed to meet the evolving needs of the AI/ML community, ensuring that KitOps remains a robust and versatile tool for managing machine learning workflows.

Looking ahead, the KitOps team is focused on further development and community engagement. By joining the CNCF Sandbox, KitOps aims to leverage the CNCF’s resources and community to accelerate its growth and adoption. The project plans to continue evolving based on user feedback, with a strong emphasis on open-source collaboration and innovation.

How to Install and Use KitOps

Installing KitOps

KitOps can be installed on Linux, macOS, and Windows. Download the binary for your platform from the official KitOps GitHub repository and place it somewhere on your PATH so that it is accessible from the command line or terminal.
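For example, a Linux install might look like the following sketch, where the archive name is illustrative and will vary by release and platform:

tar -xzvf kitops-linux-x86_64.tar.gz
sudo mv kit /usr/local/bin/
kit version

If kit version prints the CLI version, the binary is on your PATH and ready to use.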

Using KitOps

To begin, you can either use one of KitOps’ sample ModelKits or import a ModelKit directly from Hugging Face. To retrieve a sample ModelKit from Jozu Hub, execute the following command:

kit unpack jozu.ml/jozu-quickstarts/fine-tuning:latest

This command will unpack the entire ModelKit onto your machine. KitOps also allows selective unpacking, letting you extract specific components such as the model, dataset, code, or configuration. Refer to the unpack command documentation for more details.
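For instance, to extract just the model into a separate directory, something like the following should work, assuming the filter and output-directory flags described in the unpack documentation:

kit unpack jozu.ml/jozu-quickstarts/fine-tuning:latest --model -d ./model-only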

View the Unpacked Files

Once unpacked, list the directory contents to verify the result. You should see files such as:

Kitfile: The manifest detailing your ModelKit

README.md: Documentation for guidance

Model files (e.g., llama3-8b-8B-instruct-q4_0.gguf)

lora-adapter.gguf: A LoRA adapter for the model

training-data.txt: The training data used for fine-tuning

The Kitfile is essential, as it defines the ModelKit structure and can be inspected using the info and inspect commands.
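For example, the following commands display the Kitfile and the underlying manifest for the sample ModelKit (depending on whether the ModelKit is stored locally, a remote flag may be needed; see the command documentation):

kit info jozu.ml/jozu-quickstarts/fine-tuning:latest

kit inspect jozu.ml/jozu-quickstarts/fine-tuning:latest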

Verify the Local Repository

Check your local repository for existing ModelKits:

kit list

If this is your first time, the table may be empty.

Pack the ModelKit

To create a ModelKit, pack your files using:

kit pack . -t jozu.ml/brad/quick-start:latest

Replace jozu.ml/brad/quick-start:latest with your specific registry, user, repository, and tag names. Upon successful packing, you’ll receive confirmation messages indicating each saved component.
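Note that kit pack expects a Kitfile in the directory being packed; the sample ModelKit unpacked earlier already includes one. If you are assembling a project from scratch, a minimal Kitfile might look like the sketch below, where all names, versions, and paths are illustrative:

# Minimal illustrative Kitfile; adjust names, versions, and paths to your project.
manifestVersion: "1.0"
package:
  name: quick-start
  version: 1.0.0
  description: Example fine-tuned model package
model:
  name: llama3-8b-instruct
  path: ./llama3-8b-8B-instruct-q4_0.gguf
  description: Quantized model weights
datasets:
  - name: training-data
    path: ./training-data.txt
    description: Fine-tuning data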

Review the Local Repository Again

Confirm the ModelKit has been added:

kit list

Remove an Incorrect ModelKit

If you need to correct errors in a ModelKit, you can remove it from the local repository before repacking.
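For example, assuming the tag used above, the local copy can be deleted with the remove command and then repacked:

kit remove jozu.ml/brad/quick-start:latest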

Push the ModelKit to a Remote Repository

Upload your ModelKit to a remote repository:

kit push jozu.ml/brad/quick-start:latest

Ensure the repository exists and that you have push permissions. Some registries, such as Jozu Hub, require the repository to be created manually first.
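If the registry requires authentication, log in before pushing; for example (a sketch using the kit login command with your own registry and credentials):

kit login jozu.ml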

The steps above can be automated with GitHub Actions, and ModelKits can be deployed to Kubernetes so the model service can scale; a rough workflow sketch follows.
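As an illustration only, a GitHub Actions job could install the kit CLI and then pack and push a ModelKit on every change to the main branch. Everything below (the workflow name, the release URL variable, the secrets, and the tag) is an assumption for the sketch, not the official KitOps action:

name: publish-modelkit
on:
  push:
    branches: [main]
jobs:
  pack-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Illustrative install step: download the kit CLI from a release archive
      # (set KIT_RELEASE_URL to a real release for your platform).
      - name: Install kit
        run: |
          curl -sSL "$KIT_RELEASE_URL" | sudo tar -xz -C /usr/local/bin kit
        env:
          KIT_RELEASE_URL: ${{ vars.KIT_RELEASE_URL }}
      # Pack and push using registry credentials stored as repository secrets.
      - name: Pack and push the ModelKit
        run: |
          echo "${{ secrets.JOZU_TOKEN }}" | kit login jozu.ml -u "${{ secrets.JOZU_USER }}" --password-stdin
          kit pack . -t jozu.ml/brad/quick-start:latest
          kit push jozu.ml/brad/quick-start:latest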

Conclusion

The release of KitOps 1.0 represents a major achievement for the project and its community. With its proven track record in production environments and its commitment to continuous improvement, KitOps is well-positioned to become a key player in the cloud-native AI/ML landscape. As it progresses within the CNCF ecosystem, KitOps will continue to empower organizations to manage their AI/ML projects with greater efficiency, security, and consistency. You can learn more about KitOps from the official website, and contribute to the project on GitHub.

If you are interested in AI, DeepSeek is a very popular LLM tool you should know more about. And if you want to run LLM models locally, so that you can run your prompts without an internet connection, we have you covered here.

