Learn what the model behind ChatGPT can do and why you will be able to use it without paying anything at all
The buzz continues after OpenAI's presentation, just a few days ago, of its new LLM (Large Language Model): GPT-4o. I get the feeling that social networks are as excited about this launch as they were about ChatGPT in 2022, if not more.
But perhaps, even if you have heard about ChatGPT or one of these artificial intelligence models, you don't know what they are for, how they work or, even more importantly, how much they cost. Don't worry: here at El TecnoloGuía I'm going to try to answer all of those questions.
What is GPT-4o?
This new model is the evolution of its predecessor, GPT-4, and the "o" added at the end comes from "omni", Latin for "all", which already gives us a clue about the capabilities it will have.
This new language model aims to cover every possible field. Although I will go deeper into each of them later, know that besides text it can handle images and voice, and even, as we will see, recognize things through live video.
Furthermore, compared with the previous model, it stands out for its incredible response speed: as I have been able to test for myself, it returns information at something close to human conversational speed.
Here you can see a comparison: on the left, ChatGPT with GPT-4, and on the right, the new GPT-4o. The agility of the new model is obvious. Incredible…
As you may have seen, it feels almost instantaneous, and above all much faster than the model we used until now. This is especially noticeable when interacting with the model by voice: it even lets us interrupt it and rejoins the conversation immediately.
How much does ChatGPT with GPT-4o cost?
Since I know you don't want to wait to use the new ChatGPT, I'll tell you right away: the new GPT-4o is completely free for everyone, so just finish the article and then follow the instructions I leave you here.
Until now, to use ChatGPT with all its options you had to pay a monthly fee: €21.99 in Europe, or $20 in the United States.
That subscription gave you more options: you could ask more questions (the free tier was limited), you had access to a better model, you could use GPTs (custom, pre-configured versions of ChatGPT) and, ultimately, you got improvements that were not in the free version.
Well, all of that has changed: with the arrival of GPT-4o there are no feature differences between paying users and those with a completely free account. It works exactly the same in both cases.
The only real difference is that ChatGPT Plus users (yes, I include myself) get a message limit 5 times higher than free accounts. So, although there are no exact figures, it seems we will be able to have many more interactions.
To sign up, just go to the official ChatGPT website and create an account to start using the model, although if you are on the free tier it may take a few days before you can see it in all its splendor.
Can ChatGPT be used now with the new model?
The answer is yes and no. On the one hand, at the text level GPT-4o is already fully integrated into ChatGPT Plus: we can choose it from the model drop-down in the application itself, so it is definitely there.
But for now it does not have all the options we saw in the presentation and that I will show you below. OpenAI has already warned that they will be rolled out little by little, first for paying users and then for everyone else.
In any case, even Plus users will have to wait to see GPT-4o in full, especially in Europe, where it will arrive later than in the USA.
What can we do with GPT-4o?
Now that we know what it is and have seen the huge improvement over GPT-4, let's see how it can help us, what improvements it brings and what new ways it has of communicating with the real world. I warn you: what you are about to see is the future, today…
In these lines I'll leave you the link to the full presentation, since it is only 30 minutes long and worth watching from start to finish. Meanwhile, let's break down, little by little, the capabilities of this new language model according to the type of interaction.
Turn food photos into a recipe
Reverse cooking. Imagine you have a dish in front of you and you don't know what ingredients it contains or how to cook it. That's no longer a problem: send a photo to ChatGPT and let it tell you how to prepare that appetizing sandwich. And it also helps you with math…
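For the curious: the same "photo in, recipe out" trick is available to developers through OpenAI's chat-completions API, which accepts images alongside text for GPT-4o. The sketch below only builds the request payload (the image URL and the prompt wording are made-up examples); actually sending it requires the `openai` package and a valid API key.

```python
# Sketch: a GPT-4o image-to-recipe request payload, in the shape the
# OpenAI chat-completions API expects for mixed text + image input.
# We only construct it here; sending it needs the `openai` SDK and a key.

def build_recipe_request(image_url: str) -> dict:
    """Build a chat-completions payload asking GPT-4o to reverse-engineer a dish."""
    return {
        "model": "gpt-4o",
        "messages": [
            {
                "role": "user",
                "content": [
                    # Text part: what we want the model to do with the photo.
                    {"type": "text",
                     "text": "What dish is this? List the likely ingredients "
                             "and give step-by-step cooking instructions."},
                    # Image part: a URL (or data: URI) of the photo of the dish.
                    {"type": "image_url",
                     "image_url": {"url": image_url}},
                ],
            }
        ],
    }

payload = build_recipe_request("https://example.com/sandwich.jpg")
```

With the real SDK you would pass these same fields to `client.chat.completions.create(**payload)` and read the recipe from the response.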
We will have our personal tutor
One of the most incredible uses: we can ask it for the answer to a problem, but what if I want it to show me how to get there? Exactly, like a teacher. GPT-4o can now act as one, guiding us step by step and correcting our mistakes.
Translator in real time
Imagine you are traveling to a country where you don't speak the language. Can you imagine language no longer being a barrier to travel? GPT-4o can translate a conversation practically instantly and act as an interpreter.
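The interpreter behaviour shown in the demo is essentially a system prompt plus the conversation turns. As a rough sketch (the prompt wording is my own, not OpenAI's), this is how one might assemble the message list for such a session; real use would send it to GPT-4o, ideally through its voice mode.

```python
# Sketch: the live-interpreter pattern as a system prompt plus user turns.
# Payload construction only; an actual session would send these messages
# to GPT-4o via the OpenAI API.

SYSTEM_PROMPT = (
    "You are a real-time interpreter. When you hear English, repeat it in "
    "Spanish; when you hear Spanish, repeat it in English. "
    "Translate only; do not add commentary."
)

def interpreter_messages(turns: list[str]) -> list[dict]:
    """Assemble the message list for a round of interpreted conversation."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    # Each spoken sentence becomes a user turn for the model to translate.
    messages += [{"role": "user", "content": turn} for turn in turns]
    return messages

msgs = interpreter_messages(["Hello, how are you?"])
```

The key design point is that the system prompt pins the model to one job (translate, both directions), so each new turn needs no extra instructions.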
A companion for historians
Well, for historians and for anyone with a text they cannot read. ChatGPT has shown it can transcribe handwritten texts from more than two centuries ago with almost 100% accuracy. No document will resist us any longer.
Technical chart analysis
It can now help with trading by analyzing currency or cryptocurrency charts. It's not a use I would give it, but I must admit that the detailed analysis it produces from the chart alone is impressive.
Vision mode to help the blind
It is as if the accessibility mode of our electronic devices were transferred to real life. This new model sees for us and tells us what is in front of us. For people with vision problems, the progress is overwhelming.
And there are many more like these, with more to come. These examples are just a sample of GPT-4o's potential and capabilities. I don't want to imagine what the next generation of language models will be like.
The future of GPT-4o in AI gadgets
For the field I cover on this website, devices with Artificial Intelligence, this is a great advance: at least the Humane AI Pin and the rabbit r1 have confirmed that their devices will run GPT-4o.
That would mean finally setting the phone aside for many of the functions we use these gadgets for, such as image recognition or searching for information. We just have to see what path these companies take. I'm looking forward to it…