After World War II, with the development of the transistor, scientists discovered they could do far more with small amounts of electricity.
Transistors switch electronic signals, and by combining them you can build circuits that remember numbers, compare them, or add them up.
Transistors led to the development of the computer.
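The idea that switching elements combine into circuits that can add can be sketched in a few lines. This is an illustrative sketch only: the gate functions below stand in for transistor circuits (in real hardware, an AND or XOR gate is built from a handful of transistors), and the `half_adder` name is mine, not a term from the text.

```python
# Sketch: transistors act as switches; combinations of switches form logic
# gates, and gates combine into circuits that add. These functions are
# illustrative stand-ins for transistor circuits, not real hardware.

def AND(a, b):
    # Output is on only when both inputs are on.
    return a & b

def XOR(a, b):
    # Output is on when exactly one input is on.
    return a ^ b

def half_adder(a, b):
    """Add two one-bit numbers: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(0, 1))  # 0 + 1 -> (sum 1, carry 0)
print(half_adder(1, 1))  # 1 + 1 = binary 10 -> (sum 0, carry 1)
```

Chaining such adders bit by bit (feeding each carry into the next stage) is how a processor adds whole numbers.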
During the 1960s, the first integrated circuits (also known as microchips) appeared. These were made of complex combinations of transistors and resistors on a single piece of silicon.
Gradually, they were developed into central processing units that run computers, smartphones and tablets.
To give you an idea: in 1951, a computer could perform 2,000 mathematical instructions per second.
By 1974, this had risen to 330,000 instructions per second.
By 2013, an Intel i7 could do 120,300,000,000 instructions per second.
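The scale of that growth is easier to grasp as ratios. A minimal sketch, using only the three figures quoted above:

```python
# Instructions per second, taken from the figures in the text.
speeds = {1951: 2_000, 1974: 330_000, 2013: 120_300_000_000}

# How many times faster each later machine was than the earlier one.
print(speeds[1974] / speeds[1951])  # 1951 -> 1974: 165x faster
print(speeds[2013] / speeds[1974])  # 1974 -> 2013: roughly 364,545x faster
```

So the jump from 1974 to 2013 dwarfs the jump from 1951 to 1974 by a factor of thousands.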
Before the 1960s, electronics ran on vacuum tubes: large, cumbersome glass objects, each doing the same job as one tiny transistor.
You would need 700 million of these for one Intel i7 CPU.