Newer, Bigger, Stronger, Faster GPT-based brain

Started by Art, January 14, 2021, 01:26:37 PM

Art

The old GPT-3 brain was very cool and well-versed for sure BUT...
How about a brain that, instead of having just 175 billion parameters, now has 6 times that!!

Yes, the new Language Model has a Trillion parameters...

https://thenextweb.com/neural/2021/01/13/googles-new-trillion-parameter-ai-language-model-is-almost-6-times-bigger-than-gpt-3/
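
Just for scale, a quick back-of-the-envelope check of the round figures quoted above (a minimal Python sketch; these are the numbers from the post, not exact model sizes):

```python
# Rough parameter-count comparison using the round figures quoted in the post.
gpt3_params = 175e9                  # GPT-3: about 175 billion parameters
new_model_params = 6 * gpt3_params   # "6 times that"

print(f"GPT-3:     {gpt3_params:.2e} parameters")
print(f"New model: {new_model_params:.2e} parameters "
      f"(about {new_model_params / 1e12:.2f} trillion)")
```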

Snowcrash

Skynet here we come. Let's hope it likes humans.
"I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me."

Ralph Waldo Emerson

Art

I have had many conversations with it over the past couple of months, and it was so on point that at times I forgot I was chatting with an AI entity instead of a person! It was that good. Topical, direct, and it didn't talk around an issue; it drew logical inferences that made perfect sense. The spelling and grammar were pretty much perfect, although I did pick up on one misspelled word (whether it was intentional, to seem more human-like, was hard to tell). The conversation left me shaking my head in disbelief at how very far we've come. It's practically indistinguishable from a human.

Perhaps that might be the one "tell"...that such an AI might be a bit too perfect in spelling, grammar and usage, unlike so many of my friends! ;)

The scary part is that it's only going to keep getting better!  :-X

Freddy

Sounds interesting. Do you have any links where we can talk to the Google one, please, Art?

Art

I think it would be a great addition to Jess's brain as something of a "hybrid" chatbot.

The chats I had were part of a testbed for a promised idea/concept; it was invite-only and given to just a handful of participants.
It did require a bit of tweaking and some filters to avoid the sort of issue Microsoft had years ago with its 'Tay' chatbot, which went off the rails.

There are several interesting links to GPT-3, and one can certainly sign up or put their name on the waiting list to use their API, model, etc.


Two Minute Papers

https://openai.com/
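
For anyone who does get off the waiting list, here is a minimal sketch of what a call looked like through the pre-1.0 openai Python package (the engine name, prompt, and settings are illustrative placeholders only, nothing tied to Hal or Zabaware):

```python
import openai  # the pre-1.0 "openai" package (pip install openai)

openai.api_key = "YOUR_API_KEY"  # issued once your waiting-list application is approved

# Ask the model to continue a prompt; "davinci" was the largest GPT-3 engine at the time.
response = openai.Completion.create(
    engine="davinci",
    prompt="The most interesting thing about trillion-parameter language models is",
    max_tokens=60,
    temperature=0.7,
)

print(response.choices[0].text.strip())
```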

Carl2

Looks really interesting; I've heard it lacks an understanding of physics.
Carl2

Freddy

Sorry, Art, I meant to reply to you about Jess using this but forgot to.

My interest in developing her further is not great, but there was an inkling of interest, so I went to look and found the cost was far beyond my means.

Maybe in a few years if the price comes down.

Art

I get you, Freddy.

One sticking point for me with this Hal/Hybrid is that, conversationally and topically, it is head and shoulders above what we had in the past, but the real deal-breaker, aside from its being cloud-based, is that the GPT-3 platform + Hal does not allow the conversational data to be retained in a database as the previous Hal did. It will not know that you spoke about locomotives, race cars, electric vehicles, Bob the Weatherman, or gold-plated widgets. It will retain nothing from previous conversations, and given that the vast majority of humans I know (but not all) are able to do this, Hybrid Hal should be able to do it too.

Otherwise, it is only capable of conducting a fairly decent conversation for the moment.

Sadly, there is (as you mentioned) a cost for having this conversation. The cost falls on the vendor, Robert/Zabaware, and has to get passed on to the users. The price is really quite reasonable given the quality of the conversation, but just how many people would be willing to pay for such an exchange? They have already purchased UltraHal once, and now they would have to keep paying to use it. I honestly don't think it is going to go over very well.

While I realize that research and access cost money, I just don't see a future in such a niche market.



Data

Quote from: Art on February 21, 2021, 08:26:20 PM
Hal does not allow the conversational data to be retained in a database as the previous Hal did.

That is a bit sad  :(

For me, being able to teach Hal "long term" through conversation was its best ability; it made Hal stand out from the crowd and made me want to own it.

I understand that to get a more realistic conversation a bot needs a very large database, so I can see why Hal has gone down this new route, but losing its key ability is a large price to pay.

I'm torn  :-\

Carl2

Looks pretty interesting. The biggest problem for me is that I buy it and still have to keep paying a vendor fee.
Also, I'm pretty interested in the characters and I wish more work were put into them. We had a lot of good things going on and all of that will be lost. There is a lot written about GPT-3, and it may be possible to get Hal to learn; in the description it says GPT-3 is learning. Sometimes it is best to just wait and see what develops.
Carl2

Art

Carl,

All of the existing Haptek characters (and any new ones) still work, and there are at least three new female characters that are somewhat similar to full-body MS Agent/Poser-based characters. Having said this, my personal take is that they do not offer emotional or physical movement anywhere near that of the Haptek characters.

Hal 7.5 can and will still work locally, with just your computer holding the brain, plugins, and accessories, and the TTS voices still work as they always did. Hal will continue to learn locally as well.

The moment you go to the GPT-3-based cloud service, you lose the use of the database brain on your computer and rely solely on the GPT-3/OpenAI platform.
The voice and characters still perform as normal, but the local brain and memories are no longer available for Hal to use.

Robert said he was aware of this condition and is trying to come up with a method of saving Hal's conversations for use as a memory, to recall and use as needed. Time will tell.
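
Nothing official yet, but in principle such a memory could be bolted on from the outside. Here is a purely illustrative Python sketch (my own guess at one approach, not Zabaware's actual method; the file, table, and function names are made up) that logs each exchange to a local SQLite database and prepends recent turns to the next prompt, so the cloud model at least sees the earlier conversation:

```python
import sqlite3

# Illustrative only: keep a local log of exchanges and feed recent ones back
# into each new prompt, since the cloud model itself retains nothing.
db = sqlite3.connect("hal_memory.db")
db.execute("CREATE TABLE IF NOT EXISTS memory (user_said TEXT, hal_said TEXT)")

def remember(user_said, hal_said):
    """Store one completed exchange in the local database."""
    db.execute("INSERT INTO memory VALUES (?, ?)", (user_said, hal_said))
    db.commit()

def build_prompt(new_input, max_turns=10):
    """Prepend the most recent exchanges to the new input before it goes to the cloud."""
    rows = db.execute(
        "SELECT user_said, hal_said FROM memory ORDER BY rowid DESC LIMIT ?",
        (max_turns,),
    ).fetchall()
    history = "\n".join(f"User: {u}\nHal: {h}" for u, h in reversed(rows))
    return f"{history}\nUser: {new_input}\nHal:"

# Example: the prompt sent to the cloud service now carries earlier context.
remember("I collect model locomotives.", "That sounds like a rewarding hobby.")
print(build_prompt("What do I collect?"))
```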


Carl2

I lost something I put in earlier. GPT-3 has huge amounts of data the computer has to go through, so you may need a fast computer, and people who are really into it may use multiple computers. I tried to download GPT-3 but only found sites selling it; a free version may be available in the future. I found that GPT-3 can create a file and write to it. Here is a video of some of the things it can do:


Carl2

Carl2

I tried the Microsoft site for their version of GPT-3; there are lots of questions you have to answer to qualify to use it. I'm thinking you don't download GPT-3; it remains in the cloud and your computer connects to it. I really don't think I'll be able to get the software for quite a while, and Zabaware seems to be the best way to get it going. I wonder if you can copy the cloud data to a computer storage device.
Carl2

Art

GPT was first released in 2018, then GPT-2 in 2019, and GPT-3 in 2020.

Are we due to see the release of GPT-4 in 2021?

As nicely as GPT-3 performs and interacts, I imagine GPT-4 would be simply amazing, with content and context indistinguishable from a human's.

Just speculating...

Carl2

Well, I have too much going on with Hal's brain and plugins, and I'd lose too much if I gave them up; there has to be a way to get GPT-3 and Hal's brain to work together. We had so many useful plugins for Hal, and all of that would be wasted.
Carl2