Napalm Automation

AI models

OpenAI has released a commercial API that provides access to new artificial intelligence models developed by the company. The company notes that the API offers a universal “text in, text out” interface, allowing users to apply it to almost any English-language task. The software can translate, write stories and poems, and answer everyday questions.

OpenAI explained that the API can be “programmed” by showing it just a few examples of the desired output. The software also allows performance to be tuned for specific tasks by training on data sets the user provides.
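Few-shot “programming” of this kind amounts to packing a handful of worked examples into the prompt and letting the model continue the pattern. A minimal sketch, assuming a hypothetical English-to-French translation task (the example pairs and helper name are illustrative, not taken from OpenAI's documentation):

```python
def build_few_shot_prompt(examples, query):
    """Pack worked input/output pairs into a single prompt string.

    The model is expected to continue the pattern and fill in the
    answer after the final 'French:' cue.
    """
    lines = []
    for english, french in examples:
        lines.append(f"English: {english}")
        lines.append(f"French: {french}")
    lines.append(f"English: {query}")
    lines.append("French:")  # left open for the model to complete
    return "\n".join(lines)

# Illustrative example pairs; any task can be framed the same way.
examples = [
    ("Good morning", "Bonjour"),
    ("Thank you", "Merci"),
]
prompt = build_few_shot_prompt(examples, "See you tomorrow")
print(prompt)
```

The same pattern works for question answering, summarization, or any other task that can be expressed as text in, text out.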

The API gives access to GPT-3-family models with improved speed and throughput.
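At launch the API was exposed over HTTP as a completion-style endpoint. The sketch below shows roughly what assembling such a request might look like; the endpoint path, engine name, and parameters are assumptions based on the beta-era interface, not an authoritative reference, and nothing is actually sent over the network:

```python
import json

API_BASE = "https://api.openai.com/v1"  # assumed base URL

def build_completion_request(engine, prompt, api_key, max_tokens=32):
    """Assemble (url, headers, body) for a completion-style call.

    Nothing is sent here; pass the pieces to any HTTP client.
    """
    url = f"{API_BASE}/engines/{engine}/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"prompt": prompt, "max_tokens": max_tokens})
    return url, headers, body

url, headers, body = build_completion_request(
    "davinci", "Translate to French: cheese ->", "sk-..."
)
```

The response would contain the model's continuation of the prompt as plain text, consistent with the “text in, text out” design.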

As the company stressed, OpenAI cannot foresee all the possible consequences of using the technology, which is why it decided to open access through an API.

“We hope that the API will significantly lower the barrier to producing useful AI-based products, resulting in tools and services that are difficult to imagine today,” the company said.

Anyone interested in gaining access to the beta can join the likes of Algolia, Quizlet and Reddit, as well as the Middlebury Institute of International Studies, which have been given access to the API by invitation. Over time, OpenAI plans to expand access, and the software will eventually become publicly available.

The API is the product of years of research and of processing some of the largest text corpora available. Eli Chen, CEO of the startup Veriph.ai, who tried an earlier version of the product, said it feels as if the API has combined “all human knowledge.”

Greg Brockman, co-founder and chief technology officer of OpenAI, sees the new product as a major advance in artificial intelligence, the first step toward incorporating AI into virtually every piece of software. He promises that OpenAI will be careful in its handling of the technology so that it doesn’t do “harm.” “It’s hard to anticipate everything that could happen,” Brockman noted. However, he said, it’s better to test the technology now, while it’s easily controllable, and learn the necessary lessons.

In late May, OpenAI showed off its new GPT-3 algorithm, designed to generate text from just a few examples. Its architecture is similar to GPT-2's, but the model has 175 billion parameters and was trained on roughly 570 gigabytes of text. GPT-3 can answer questions about text it reads, as well as write poems, solve anagrams, and translate. It needs only 10 to 100 examples to learn a task.
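To give a sense of scale, 175 billion parameters is an enormous model by the standards of the day. A rough back-of-the-envelope calculation of the storage needed just for the weights, assuming 2 bytes per parameter (half-precision storage, which is an assumption here, not a published figure):

```python
params = 175_000_000_000          # GPT-3 parameter count
bytes_per_param = 2               # assumed fp16 storage
total_bytes = params * bytes_per_param
print(total_bytes / 10**9, "GB")  # → 350.0 GB just for the weights
```

Even under this conservative assumption, the model is far too large to run on a single consumer machine, which helps explain why access is offered as a hosted API rather than a download.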
