Developers are adopting AI-powered code generators (services like GitHub Copilot and Amazon CodeWhisperer, along with open access models such as Meta’s CodeLlama) in surprising numbers. But the tools are far from perfect. Many aren’t free. Others are, but only under licenses that foreclose their use in common commercial contexts.
Sensing the demand for alternatives, startup Hugging Face several years ago teamed up with workflow automation platform ServiceNow to create StarCoder, an open source code generator with a less restrictive license than some of the others out there. The original was released online early last year, and work has since begun on a sequel, StarCoder 2.
StarCoder 2 is not a single code generation model, but rather a family. Released today, it comes in three variants, the first two of which can run on most modern consumer GPUs:
- A 3 billion parameter (3B) model trained by ServiceNow
- A 7 billion parameter (7B) model trained by Hugging Face
- A 15 billion parameter (15B) model trained by Nvidia, the newest supporter of the StarCoder project.
(Note that “parameters” are the parts of a model learned from its training data, and they essentially define the model’s skill on a problem, in this case generating code.)
Like most other code generators, StarCoder 2 can suggest ways to fill in incomplete lines of code, as well as summarize and retrieve code snippets when prompted in natural language. Trained with 4x more data than the original StarCoder, StarCoder 2 offers what Hugging Face, ServiceNow and Nvidia describe as “significantly” improved performance with lower running costs.
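To give a sense of that prompted-completion workflow, here’s a minimal sketch using the Hugging Face transformers library; the bigcode/starcoder2-3b checkpoint name is an assumption based on BigCode’s Hub naming, and the snippet presumes torch and accelerate are installed:

```python
# A minimal code completion sketch, assuming the bigcode/starcoder2-3b
# checkpoint is published on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-3b"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# Hand the model an incomplete function and let it fill in the rest.
prompt = 'def fibonacci(n: int) -> int:\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```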
StarCoder 2 can be fine-tuned “in hours” on a GPU like the Nvidia A100 using first- or third-party data to build apps like chatbots and personal coding assistants. And because it was trained on a larger and more diverse dataset than the original StarCoder (~619 programming languages), StarCoder 2 can make more accurate, context-aware predictions, at least hypothetically.
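To make the “in hours” claim concrete, here’s a hedged sketch of lightweight fine-tuning on first-party code with the transformers and peft libraries; the my_org/internal-code dataset and the q_proj/v_proj LoRA target modules are illustrative assumptions, not details from the announcement:

```python
# A hedged LoRA fine-tuning sketch: only small adapter weights are
# updated, which is what makes a single-GPU (e.g. A100) run feasible.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

checkpoint = "bigcode/starcoder2-3b"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
tokenizer.pad_token = tokenizer.eos_token  # no pad token by default
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],  # assumed attention module names
))

# "my_org/internal-code" is a hypothetical first-party dataset with a
# "content" column of raw source files.
dataset = load_dataset("my_org/internal-code", split="train")
dataset = dataset.map(
    lambda ex: tokenizer(ex["content"], truncation=True, max_length=1024),
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="starcoder2-finetuned",
                           per_device_train_batch_size=2,
                           num_train_epochs=1,
                           learning_rate=2e-4,
                           bf16=True),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```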
“StarCoder 2 was built specifically for developers who need to build apps quickly,” Harm de Vries, head of ServiceNow’s StarCoder 2 development team, told TechCrunch. “With StarCoder2, developers can use its capabilities to make coding more efficient without sacrificing speed or quality.”
Now, I would venture to say that not every developer would agree with De Vries on the speed and quality points. Code generators promise to streamline some coding tasks — but at a cost.
A recent Stanford study found that engineers who use code generation systems are more likely to introduce security vulnerabilities into the apps they develop. Elsewhere, a poll from Sonatype, the cybersecurity firm, shows that the majority of developers are concerned about the lack of insight into how code generators produce code, and about “code sprawl” from generators producing too much code to manage.
StarCoder 2’s license may also prove to be an obstacle for some.
StarCoder 2 is licensed under Hugging Face’s RAIL-M, which aims to promote responsible use by imposing “light touch” restrictions on both model licensees and downstream users. While less restrictive than many other licenses, RAIL-M isn’t truly “open” in the sense that it doesn’t permit developers to use StarCoder 2 for every conceivable application (medical advice apps are strictly off-limits, for example). Some commentators say RAIL-M’s requirements may be too vague to comply with in every case, and that RAIL-M could conflict with AI-related regulations like the EU AI Act.
Putting all that aside for a moment, is StarCoder 2 really superior to the other code generators out there — free or paid?
Depending on the benchmark, it appears to be more efficient than one of the versions of CodeLlama, CodeLlama 33B. Hugging Face says that StarCoder 2 15B matches CodeLlama 33B on a subset of code completion tasks at twice the speed. Which tasks isn’t clear; Hugging Face didn’t specify.
StarCoder 2, as an open collection of models, also has the advantage of being able to be deployed locally and “learn” a developer’s source code or codebase, an attractive prospect to developers and companies wary of exposing code to a cloud-hosted AI. In a 2023 survey from Portal26 and CensusWide, 85% of businesses said they were wary of adopting GenAI like code generators because of the privacy and security risks, such as employees sharing sensitive information or vendors training on proprietary data.
Hugging Face, ServiceNow and Nvidia also argue that StarCoder 2 is more ethical – and less legally fraught – than its rivals.
All GenAI models regurgitate; in other words, they sometimes spit out near copies of the data they were trained on. It doesn’t take an active imagination to see why this could get a developer into trouble. With code generators trained on copyrighted code, it’s entirely possible that, even with filters and additional safeguards in place, the generators could inadvertently suggest copyrighted code without labeling it as such.
Some vendors, including GitHub, Microsoft (GitHub’s parent company), and Amazon, have pledged to provide legal coverage in cases where a customer of their code generator is accused of copyright infringement. But coverage varies from vendor to vendor and is generally limited to enterprise customers.
Unlike code generators trained on copyrighted code (GitHub Copilot, among others), StarCoder 2 was trained only on data under license from Software Heritage, the nonprofit that provides archival services for code. Ahead of StarCoder 2’s training, BigCode, the cross-organizational team behind much of StarCoder 2’s roadmap, gave code owners the chance to opt out of the training set if they wished.
As with the original StarCoder, StarCoder 2’s training data is available for developers to fork, replicate, or audit as they wish.
Leandro von Werra, a machine learning engineer at Hugging Face and co-lead of BigCode, pointed out that while there has been a proliferation of open source code generators recently, few have been accompanied by information about the data that went into training them and, indeed, how they were trained.
“From a scientific point of view, one issue is that the training is not reproducible, but also as a data producer (i.e. someone who uploads their code to GitHub), you don’t know if and how your data was used,” von Werra said in an interview. “StarCoder 2 addresses this issue by being completely transparent throughout the entire training pipeline from pre-training data scraping to training itself.”
StarCoder 2 isn’t perfect, he said. Like other code generators, it’s susceptible to bias. De Vries notes that it can generate code with elements that reflect stereotypes around gender and race. And because StarCoder 2 was trained on predominantly English-language comments and Python and Java code, it performs more weakly with languages other than English and with “lower-resource” code like Fortran and Haskell.
Still, von Werra argues it’s a step in the right direction.
“We strongly believe that building trust and accountability with AI models requires transparency and control over the full model pipeline, including the training data and training recipe,” he said. “StarCoder 2 [showcases] how fully open models can deliver competitive performance.”
You might be wondering (as is this writer) what incentive Hugging Face, ServiceNow, and Nvidia have to invest in a project like StarCoder 2. They’re businesses, after all, and training models doesn’t come cheap.
As far as I can tell, it’s the tried-and-true strategy of cultivating goodwill and building paid services on top of open source releases.
ServiceNow has already used StarCoder to build Now LLM, a code generation product optimized for ServiceNow workflow patterns, use cases, and processes. Hugging Face, which offers model implementation consulting plans, provides hosted versions of the StarCoder 2 models on its platform. So does Nvidia, which makes StarCoder 2 available through an API and a web front end.
For developers specifically interested in the no-cost offline experience, StarCoder 2 — the models, source code, and more — can be downloaded from the project’s GitHub page.
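As a rough illustration of that offline route, the weights can also be pulled down programmatically; this sketch assumes the bigcode/starcoder2-3b checkpoint is mirrored on the Hugging Face Hub and that the huggingface_hub client is installed:

```python
# Fetch model weights, tokenizer files, and config for offline use,
# assuming the bigcode/starcoder2-3b checkpoint on the Hugging Face Hub.
from huggingface_hub import snapshot_download

local_dir = snapshot_download("bigcode/starcoder2-3b")
print(f"Model files saved to: {local_dir}")
```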