OpenAI has released GPT-4, the latest version of its hugely popular artificial intelligence chatbot ChatGPT.

The new model can respond to images - for example, it can suggest recipes based on photos of ingredients, as well as write captions and descriptions.

It can also process up to 25,000 words, about eight times as many as ChatGPT.

Millions of people have been using ChatGPT since its launch in November 2022.

Popular uses include songwriting, poetry, promotional copy, computer coding, and homework help, although some teachers say students should not use it.

ChatGPT answers questions in human-like natural language, and it can mimic other writing styles, such as those of songwriters and authors, drawing on the internet as it was in 2021 as its knowledge base.

There are concerns that such systems could one day take over many of the jobs people do today.

OpenAI says it spent six months working on GPT-4's safety features, training the model on human feedback. However, it warns that GPT-4 may still be prone to sharing false information.

GPT-4 will first be available to ChatGPT Plus subscribers who pay $20 per month for premium access to the service.

GPT-4 also powers Microsoft's Bing search chatbot. The tech giant has invested $10 billion in OpenAI.

In a live demo, GPT-4 produced an answer to a complex tax question, although there was no way to verify its answer.

Like ChatGPT, GPT-4 is a form of generative artificial intelligence. Generative AI uses algorithms and text prediction to create new content based on your requests.
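The idea of "text prediction" can be illustrated with a deliberately tiny sketch - this is not how GPT-4 itself works (it uses a large neural network), but a toy bigram model shows the same principle of generating new text by repeatedly predicting a likely next word:

```python
import random
from collections import defaultdict

# Toy illustration only (not OpenAI's actual method): a bigram model
# that "predicts" the next word from words seen to follow it before.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Record which words follow each word in the training text.
next_words = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    next_words[a].append(b)

def generate(start, length=5, seed=0):
    """Generate text by sampling each next word from observed continuations."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:  # no known continuation - stop generating
            break
        words.append(rng.choice(candidates))
    return " ".join(words)

print(generate("the"))
```

Large language models do the same thing at vastly greater scale, predicting tokens with probabilities learned from huge amounts of text rather than simple word counts.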

GPT-4 generates recipes from an image

According to OpenAI, GPT-4 has "improved reasoning capabilities" compared to ChatGPT. For example, the model can find available meeting times across three different schedules.

OpenAI also announced new partnerships with language-learning app Duolingo and Be My Eyes, an app for blind people, to build AI chatbots for their users.

However, OpenAI warned that, like its predecessors, GPT-4 is still not fully reliable and can produce "hallucinations" - a phenomenon in which an AI misstates facts or makes reasoning errors.
