IBM is introducing decoder-only models for generative code tasks as part of its Granite collection. The models have been trained on code written in 116 programming languages and range in size from 3 to 34 billion parameters. https://www.i-programmer.info/news/105-artificial-intelligence/17226-ibm-launches-the-granite-code-llm-series.html
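The checkpoints are published under IBM's ibm-granite organization on Hugging Face, so a quick way to try one is through the transformers library. The sketch below is a minimal example, not IBM's own recipe; the model id "ibm-granite/granite-3b-code-base" is assumed to be the name of the 3B base variant, and the prompt is just an illustration.

```python
# Minimal sketch: code completion with a Granite code model via transformers.
# The model id below is an assumption; substitute the size/variant you want.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```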
A diary of my work. Just links to the full articles on i-programmer.info