Human brain and credit card numbers
These days, I work in the field of machine learning, which is a subfield of artificial intelligence. Today's hottest machine learning method is deep learning.
Deep learning was known as neural networks when I was growing up in the late '90s. The fundamental motivation of neural networks is to represent the human brain using computers (both hardware and software). The field has a long history, beginning in 1958 with Frank Rosenblatt's paper, "The Perceptron: A probabilistic model for information storage and organization in the brain" (PDF). We have come a long way since then, thanks to people like Geoffrey Hinton and his deep learning progeny.
The human brain can do about one quadrillion logical operations per second, i.e., a petaflop of computational ability (see Brains vs. Computers). These days we have become smarter and don't try to build a single massive computer (a supercomputer). Instead, we connect a bunch of computers with high-speed networks and call it a cluster. Clusters help us divide and conquer complex problems, thanks to programming models like MapReduce (PDF).
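The divide-and-conquer idea behind MapReduce can be sketched in a few lines. This is a toy, single-process word count, not the distributed system itself: in a real cluster, each chunk would live on a different machine and the map calls would run in parallel.

```python
from collections import defaultdict

def map_phase(chunk):
    # Map: emit a (word, 1) pair for each word in a chunk of text.
    return [(word, 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Reduce: sum the counts for each word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# The "cluster": each chunk could sit on a different machine.
chunks = ["deep learning", "deep neural networks", "learning machines"]
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(mapped))
# {'deep': 2, 'learning': 2, 'neural': 1, 'networks': 1, 'machines': 1}
```

The key point is that the map calls are independent of one another, so the work splits cleanly across machines; only the reduce step needs to bring results back together.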
Another recent advancement is the use of graphics processing units (GPUs) for matrix computations, thanks to the efforts of companies like NVIDIA (see CUDA). GPUs run parallelizable tasks efficiently (embarrassingly parallel problems are also easy on the human programmer, since not much coding effort is required to parallelize them). Deep learning uses matrix computations extensively, so the GPU advancement helped propel deep learning to greater heights.
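To see why matrix computations parallelize so well, here is a bare-bones matrix multiply in plain Python. Each output cell depends only on one row of A and one column of B, so every cell is an independent computation, which is exactly what lets a GPU compute thousands of them at once.

```python
def matmul(A, B):
    # Multiply two matrices given as lists of rows.
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]   # every (i, j) cell is computed
            for i in range(rows)]    # independently of the others

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

A GPU does essentially this, but assigns each (i, j) cell to its own hardware thread instead of looping over them one by one.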
Natural language processing (wiki) is another subfield of artificial intelligence, which overlaps a lot with machine learning. (It could have been better named human language processing, to distinguish it from other languages like computer languages.)
And now, I will come to the point I want to make.
I was wondering why I don't remember my credit card number. I must have used it at least a few dozen times in the past six months, but I still can't remember it. I googled "how to remember credit card number" and came across this site, which explains that "if you want to remember numbers, you've got to find the meaning in them". (That reminded me of Viktor Frankl's book "Man's Search for Meaning".)
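As an aside on finding meaning in the digits: a credit card number isn't entirely arbitrary. Its last digit is a check digit computed by the Luhn algorithm, so the whole number must pass a simple checksum. A minimal sketch (using a standard Luhn test number, not a real card):

```python
def luhn_valid(number: str) -> bool:
    # Luhn check: walking from the rightmost digit, double every
    # second digit, subtract 9 from any double above 9, and the
    # grand total must be divisible by 10.
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True (well-known Luhn test value)
```

So there is a little structure hiding in the number, though sadly not the kind that helps a human memorize it.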
Another problem is that my credit card gets stolen every few years and I get a replacement card, so there isn't enough repetition and reinforcement to commit the number to memory. On the other hand, I have memorized my Social Security number over the years: it's much shorter, and I haven't had a need to change it yet.
Now comes the question: do we want artificial intelligence to learn these numbers just because we think it would be cool? Do we think that a machine that can remember a number seen once is really intelligent? Or is this skill irrelevant to intelligence? I don't know the answer, but it was an interesting thought, hence the post :).