kallend

An AI algorithm writes a paper about itself. Cool, scary, or both?


1 minute ago, billvon said:

No, it's not. Training data sets are usually real-world data. For a self-driving car, those data sets are actual sensor inputs used for navigation, for example. For OCR, they are actual characters and words. Choosing the real-world data is something a person has to do. Much as a teacher does with human students, the network trainer has to choose clear examples with unambiguous interpretations first, and only later move into the tougher cases.
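That "clear examples first, tougher cases later" approach is what the machine-learning literature calls curriculum learning. A minimal, hedged sketch of the idea on synthetic data; the margin-based difficulty score and the staged training fractions are illustrative assumptions, not anything from the post above:

```python
# Curriculum learning sketch: train on "easy" (unambiguous) examples first,
# then fold in the harder ones. Synthetic data; the difficulty proxy below
# (distance from the decision boundary) is an illustrative assumption.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # true decision boundary
difficulty = -np.abs(X[:, 0] + X[:, 1])      # near the boundary = ambiguous = hard
order = np.argsort(difficulty)               # clearest examples first

clf = SGDClassifier(random_state=0)
for frac in (0.25, 0.5, 1.0):                # widen the curriculum in stages
    idx = order[: int(frac * len(order))]
    clf.partial_fit(X[idx], y[idx], classes=[0, 1])

print("training accuracy:", clf.score(X, y))
```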

Does the machine understand why it should not run over the pedestrian? Until you can say yes to that, it is not intelligence, merely problem solving.

4 minutes ago, gowlerk said:

Does the machine understand why it should not run over the pedestrian? Until you can say yes to that, it is not intelligence, merely problem solving.

Ken, that's a very solipsistic viewpoint. At the bare minimum we humans will never be able to know if an AI feels anything, especially if the standard is the AI feeling as a human might. I don't believe there is a requisite "why" component involved.

29 minutes ago, JoeWeber said:

Ken, that's a very solipsistic viewpoint. At the bare minimum we humans will never be able to know if an AI feels anything, especially if the standard is the AI feeling as a human might. I don't believe there is a requisite "why" component involved.

It's a hard place to be. Somewhere between the creator and the created. 

1 hour ago, gowlerk said:

Does the machine understand why it should not run over the pedestrian?

"Understanding" can be defined as a sort of data compression - mapping high-dimensional data to a lower-dimensional representation that contains the concepts or model of the thing being "understood".

So if you ask, does AI understand things? Yes, yes it does.
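As a hedged illustration of that compression framing, here is a minimal sketch that squeezes 64-dimensional digit images down to a 2-dimensional representation and reconstructs them. PCA is just the simplest stand-in for a learned encoder/decoder, not a claim about how any particular AI works:

```python
# "Understanding as compression" sketch: map high-dimensional data to a small
# representation, then reconstruct it and measure what was retained.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X = load_digits().data                    # 1797 samples, 64 dims (8x8 digit images)
pca = PCA(n_components=2).fit(X)

codes = pca.transform(X)                  # the compressed "concept" space: 64 -> 2
X_hat = pca.inverse_transform(codes)      # decode back to 64 dims

err = np.mean((X - X_hat) ** 2)
print(f"kept variance: {pca.explained_variance_ratio_.sum():.2f}, mse: {err:.2f}")
```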

 

But you might say, "if you can truly understand something you should be able to create it: if you understand English, you should be able to generate (i.e. speak) it".

AI can do that too. It can generate handwritten letters and numbers, and, like ours, the output is slightly different every time, with a style of its own that is similar to, but not exactly like, the examples it was trained on. The same goes for voices.
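A sketch of that generation side, assuming a small decoder network. The architecture and sizes here are illustrative assumptions; as written the decoder is untrained and only produces noise, but after being trained on handwriting data (e.g. as a VAE or GAN generator) each random latent vector would yield a slightly different digit in the model's own style:

```python
# Generation sketch: a small decoder maps a random latent vector to a 28x28
# image. Untrained as written (illustrative only).
import torch
from torch import nn

decoder = nn.Sequential(        # hypothetical toy generator, not a real trained model
    nn.Linear(16, 128),
    nn.ReLU(),
    nn.Linear(128, 28 * 28),
    nn.Sigmoid(),               # pixel intensities in [0, 1]
)

z = torch.randn(4, 16)          # four different random "styles"
images = decoder(z).reshape(4, 28, 28)
print(images.shape)             # torch.Size([4, 28, 28]) -- four distinct samples
```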

1 hour ago, gowlerk said:

It's a hard place to be. Somewhere between the creator and the created. 

Humans have created machines that can fly to the Moon, and even out of the solar system. We regularly drive around in vehicles that are many times faster than the fastest human being.

Our brains weigh 1.5 kg on average, consume about 15 watts, and there are more than 7 billion of them around the world, with hundreds of thousands more "manufactured" every day. A brain is just a collection of cells. Why would creating "intelligence" be off-limits when spaceflight isn't?

2 hours ago, gowlerk said:

Does the machine understand why it should not run over the pedestrian? Until you can say yes to that, it is not intelligence, merely problem solving.

Many PEOPLE do not understand why they should not run over the pedestrian. 

If your position is "machines are machines and not people and therefore they can never be intelligent" then there will never be a case where you think "that machine is exhibiting intelligence."  That's a simple conclusion.

My viewpoint is that there is nothing stopping machine intelligence. This is not based on anything I am doing now; it is based on the fact that humans are just neural networks that run on a different kind of machine. If you can simulate a human brain to within X percent of reality, that simulation will be aware and show intelligence. A human brain has about 100 billion neurons, each much slower than a transistor but far more connected. The largest processor chip we have now has 114 billion transistors, each millions of times faster than a neuron. More significantly, we now have neural-network ICs with 1.2 trillion transistors (most of which make up the neural networks) along with 400,000 processors optimized for neural-network support.

So we are reaching the level of complexity at which we can simulate a human brain. What is stopping us now is not computing power; it is our inability to map our own connectome accurately enough to simulate it. Given that, we certainly have the computing power to instantiate a simpler level of intelligence.
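For a sense of what "simulating" a neuron means in practice, here is a toy, single leaky integrate-and-fire neuron in a few lines. It is nowhere near the fidelity or scale being discussed, and the parameter values are illustrative rather than biologically calibrated:

```python
# Toy leaky integrate-and-fire neuron: the membrane potential leaks toward rest,
# integrates input drive, and emits a spike when it crosses a threshold.
import numpy as np

dt, tau, v_rest, v_thresh, v_reset = 1e-3, 20e-3, -65.0, -50.0, -65.0
v = v_rest
spikes = []

for step in range(1000):                           # simulate 1 second in 1 ms steps
    i_input = 20.0 if 200 <= step < 800 else 0.0   # inject drive mid-run
    dv = (-(v - v_rest) + i_input) / tau
    v += dv * dt
    if v >= v_thresh:                              # threshold crossed: spike and reset
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes, first few at t (s): {[round(t, 3) for t in spikes[:5]]}")
```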

On 7/2/2022 at 9:03 AM, metalslug said:

Yes, two AI algorithms, about 25 years ago as part of a hobby, albeit rudimentary and neither focused on grammar.

I’d be interested to see those.

 

Because I studied AI in its infancy about 21 years ago at UMIST, and the thinking back then was that it was ALL about grammar and language. That's why we studied linguistics as a big part of that degree.

 

What was the purpose of the algorithms you developed, and how successful were they?


2 hours ago, olofscience said:

Why would creating "intelligence" be off-limits somehow but spaceflight isn't?

I am not saying it is off-limits. I am saying it is so far beyond our technology that it is not achievable. Could the machine we make conceive of and produce itself? Spaceflight is just an extension of learning to ride a horse and make fire.

1 hour ago, gowlerk said:

Could the machine we make conceive of and produce itself?

You've got your mix all talked up. Just because it wasn't Adam and Code in the Garden of Eden doesn't preclude either from later on conceiving something that had the ability to conceive something else. 

1 hour ago, olofscience said:

BNF too?

Sorry, my knowledge of acronyms sucks (or maybe my memory!).

BNF?

At the time it was mostly linguistics, pure maths and programming. The latter was stuff like Fortran, Prolog and LISP. 
 

It was not the ability to build robots and take over the world that 18yo me imagined.

7 hours ago, yoink said:

Sorry, my knowledge of acronyms sucks (or maybe my memory!).

BNF?

At the time it was mostly linguistics, pure maths and programming. The latter was stuff like Fortran, Prolog and LISP. 
 

It was not the ability to build robots and take over the world that 18yo me imagined.

Backus-Naur Form. I hated it; luckily, I was able to avoid most of it, since I got into AI in the 2010s, when neural nets were just starting to take off.
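For reference, BNF is just a notation for writing a grammar as substitution rules. A hypothetical toy example, with the grammar in comments and a minimal recursive-descent recognizer that mirrors it; the grammar, tokenizer, and function names are made up purely for illustration:

```python
# A toy grammar in BNF for comma-separated signed integers, e.g. "1, -2, 30":
#
#   <list>   ::= <number> | <number> "," <list>
#   <number> ::= "-" <digits> | <digits>
#
# and a minimal recursive-descent recognizer that follows those rules.
import re

def tokenize(text):
    # digit runs become single <digits> tokens; everything else is one character
    return re.findall(r"\d+|\S", text)

def parse_number(tokens, pos):
    if pos < len(tokens) and tokens[pos] == "-":         # <number> ::= "-" <digits> ...
        pos += 1
    if pos >= len(tokens) or not tokens[pos].isdigit():  #            ... | <digits>
        raise SyntaxError(f"expected digits at token {pos}")
    return pos + 1

def parse_list(tokens, pos=0):
    pos = parse_number(tokens, pos)                      # <list> ::= <number> ...
    if pos < len(tokens) and tokens[pos] == ",":         #        ... | <number> "," <list>
        return parse_list(tokens, pos + 1)
    return pos

print(parse_list(tokenize("1, -2, 30")))                 # consumes all 6 tokens -> prints 6
```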

9 hours ago, gowlerk said:

I am not saying it is off-limits. I am saying it is so far beyond our technology that it is not achievable. Could the machine we make conceive of and produce itself? Spaceflight is just an extension of learning to ride a horse and make fire.

Well, the usual goalpost is "can it take my/our jobs?", so not many people actually set the threshold at self-replication yet.

I think capitalism will have enough trouble if the number of jobs taken keeps increasing. We're already starting to see it.

11 hours ago, olofscience said:

Well, the usual goalpost is "can it take my/our jobs?", so not many people actually set the threshold at self-replication yet.

I think capitalism will have enough trouble if the number of jobs taken keeps increasing. We're already starting to see it.

Hi Olof,

IMO that is not even something to discuss; it has been happening for centuries.

Think of the wheel & the inclined plane.

Jerry Baumchen

