Saturday, April 7, 2012

What Is the Smallest Planet In the Solar System?

The smallest planet in the solar system is the planet Mercury.

Also the closest planet to the Sun, Mercury is just over a third of Earth’s diameter (roughly 20 times smaller by volume) and orbits the Sun once every 87.969 Earth days.
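A quick bit of arithmetic shows how short a Mercury year really is; this sketch simply divides an Earth year by Mercury’s 87.969-day orbital period:

```python
# How many Mercury "years" fit into one Earth year?
earth_year_days = 365.25      # one Earth year, in days
mercury_year_days = 87.969    # Mercury's orbital period, from the article

orbits_per_earth_year = earth_year_days / mercury_year_days
print(f"Mercury completes about {orbits_per_earth_year:.2f} orbits per Earth year")
```

So Mercury racks up a little more than four of its years for every one of ours.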
Although bright, Mercury is not easily seen from Earth as it is normally lost in the glare of the Sun. The first spacecraft to visit the planet was Mariner 10 in 1974.
Mercury has no moons, virtually no atmosphere, and a heavily cratered surface that looks much like our Moon’s. It is also one of the densest planets in the solar system, second only to Earth, because its core contains proportionally more iron than that of any other planet.
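That density figure is easy to check from Mercury’s bulk properties. The mass and radius below are standard published reference values, not figures from the article; the sketch just divides the planet’s mass by its spherical volume:

```python
import math

mass_kg = 3.301e23       # Mercury's mass (standard reference value)
radius_m = 2_439_700     # Mercury's mean radius (standard reference value)

volume_m3 = (4 / 3) * math.pi * radius_m ** 3
density_g_cm3 = (mass_kg / volume_m3) / 1000  # convert kg/m^3 to g/cm^3
print(f"Mercury's mean density: {density_g_cm3:.2f} g/cm^3")
```

That works out to about 5.4 g/cm³, comparable to Earth’s.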
One theory to explain this suggests that most of the surface rock on Mercury was vaporized when the Sun was formed.
Thanks to its large iron core, Mercury has a global magnetic field. It is only about 1% as strong as Earth’s, but still powerful enough to deflect the solar wind around the planet, creating a magnetosphere.
Here’s a bonus factoid: Pluto used to be the smallest planet in the solar system, until it was reclassified as a dwarf planet in 2006.

Who Invented Chocolate and Where Does Chocolate Come From?

Chocolate comes from the cacao tree which has been cultivated since 1100 BC in Mexico, Central and South America.

We now know that the Aztecs made chocolate beverages called xocolātl, which means “bitter water” in Nahuatl. Chocolate was used in Maya and Aztec royal and religious rituals, and the oldest known cultivation of cacao was discovered at a site in Puerto Escondido, Honduras.
Chocolate was an important luxury good throughout the Aztec Empire, and cocoa beans were even used as currency because cacao could not be grown in the highlands of the Aztec heartland.
Chocolate drinks were introduced to Europe in the 16th century, following the Spanish conquest of the Aztecs, and Europeans added cane sugar to counteract the natural bitterness. The sweetened drink became very popular in Spain.
In the 1800s, chocolate in solid form was invented; the British firm J. S. Fry & Sons is credited with marketing the first solid chocolate bar in 1847.
During the Industrial Revolution, mechanical mills were created that squeezed out cocoa butter, which helped simplify mass production of chocolate. Many people contributed to the modern chocolate-making process, including Coenraad van Houten, Henri Nestlé, and Daniel Peter, who used milk powder to create the first milk chocolate in 1875.
These days, most of the global supply of cocoa comes from Western Africa and South America.
Besides being scrumptious, chocolate is believed by many to have health benefits because of the antioxidants it contains, which is a bonus if you’re a fan of the dark stuff.

Sunday, April 1, 2012

Who Invented the Computer Device Called a Mouse and When?

The mouse is a great piece of technology many of us use every day, and it was destined for a cute name, not a boring one.
It was invented in the 1960s by Douglas Engelbart, who called it the “X-Y Position Indicator for a Display System.” Hardly a catchy name, so early users quickly came up with a nickname of their own: “turtle.”
The animal motif continued with the more accurate description, “rodent,” that followed.
With its long hairless tail, the little pointer device resembled any number of creatures in the order Rodentia, and these early technogeeks didn’t want to discriminate.
However, good sense and a desire to sell to consumers prevailed, and the name was changed to “mouse,” which stuck permanently. Every dictionary on our shelves lists mouse, defined much like this Merriam-Webster online entry:
“mouse: a small mobile manual device that controls movement of the cursor and selection of functions on a computer display.”
Now it’s your turn to come up with a name for touch pads.

Who Invented the First Computer Mouse and How Did It Get Its Name?

The first computer mouse was given its name by the device’s inventor, Douglas Engelbart.
Because, for some reason, the “X-Y Position Indicator for a Display System” just didn’t catch on.
The first users hated the clumsiness of the name and quickly dubbed the device a “turtle,” which eventually became “rodent,” which in turn morphed into the cuter-sounding “mouse.”
This name was just right for the shape and size of the X-Y Position Indicator, and it stuck.
These days, mice are sophisticated devices with multiple buttons. Many are cordless, and most no longer use a ball for tracking; instead, an LED or low-power laser senses the surface the mouse sits on.
The advantage? We no longer have to remove the ball from the poor mouse and clean the rollers so our mouse works like new again.

Why is a Computer Problem Called a “Bug” And Where Did the Term Originate?

In 1947, Grace Hopper was part of the team working on the Harvard Mark II, an early large-scale computer built for the US Navy.
After troubleshooting an unexplained problem, the team traced the cause to a moth that had gotten stuck in one of the machine’s relays. They taped the insect into their logbook with the note “First actual case of bug being found.”
Engineers had used “bug” for mechanical faults since Thomas Edison’s day, but the incident helped popularize the term, and “debugging,” in computing.
From then on, unexplained computer problems were called bugs.

Which Is Smarter Your Brain or a Computer?

A computer is a complicated electronic or mechanical machine invented by man to solve difficult problems. But a computer can only carry out instructions that the human brain gives it.
The computer has no ability to think or reason for itself. However, it could take a man a lifetime to solve some of the problems that a computer can solve in only minutes. So, even though the human brain is more complicated and efficient than any computer, the computer is faster.

What Was Considered a Computer Before the Electronic Age and How Much Did It Cost?

The word computer first appeared in the seventeenth century as the job title of a person who did calculations as an occupation.

Most human computers weren’t paid very much, just like the Information Technology workers of today, even though their job descriptions are slightly different.
Although slide rules were sometimes called computers, it wasn’t until the 1940s, with the development of massive electronic calculating machines, that the human occupation of computing became obsolete.
These machines inherited the job title and became known as computers.

How Was the Computer Invented and When?

Imagine the world without computers. You’d have no internet to help you with your homework, and just think of all that post, photocopying and filing.
The first computer was designed in 1822, long before TV, telephones and even household electricity. The inventor, Charles Babbage, called his machine the Difference Engine. It was essentially a calculator, and since there was no electric power, it was mechanical. And very complicated.
In fact he never quite managed to finish it. Babbage spent a further 37 years designing the Analytical Engine, a precursor to the first working general-purpose computers.
Computers really got going in the 20th century: Konrad Zuse’s Z3 was the first programmable computer, invented in 1941.
The Colossus computer, built in 1943, was the first programmable, fully electronic computer. It was used to crack German codes during the Second World War.
By the end of the 1950s computers had become smaller and cheaper. (They were still about the size of a double-decker bus, though.)
Microprocessors, programmable components measuring just a few millimetres, were first produced at Intel in 1971 by a team that included Ted Hoff. Just one of them was as powerful as the huge 30-tonne computers of the 1940s.
By the 1980s computers were small and cheap enough for individuals to buy and use at home. And today, computers are everywhere and most modern electronic devices, from washing machines to cars, contain one.
Finished Engine: Although Charles Babbage never completed the Difference Engine himself, the Science Museum in London did manage to build one in 1991, to mark the 200th anniversary of Babbage’s birth. And it worked!
Computers have taken over almost all aspects of our lives, from schoolwork and homework to keeping in touch with your friends. But not so long ago computers didn’t even exist.
Can you live without using one for at least a week? Sign the declaration and see if you have the will power to give it up.

Who Invented the First Computer?

The answer to this question depends on your definition of a computer.
The first known counting devices were tally sticks, dating from about 35,000 BC.
The abacus was then invented in Babylonia around 2400 BC.
In 1837, Charles Babbage, a British professor of mathematics, described his idea for the Analytical Engine, the first design for a programmable mechanical computer. The Analytical Engine was to be powered by a steam engine and programmed with punched cards, which were already used to control mechanical looms at the time.
What made the Analytical Engine unique was that it was designed to be programmed.
Because of this, and because more than 100 years would pass before any comparable device was actually constructed, Charles Babbage is considered by many to be the “father of computing.” Owing to legal, financial, and political obstacles, the Analytical Engine was never completed; Babbage was also difficult to work with and alienated the supporters of his work.
In 1939, John V. Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC) at Iowa State University, regarded by many as the first electronic digital computer. The ABC was built by hand; its design used over 300 vacuum tubes and stored data in capacitors mounted on mechanically rotating drums.
The ENIAC (Electronic Numerical Integrator and Computer), completed in the US in 1945, is widely regarded as the first functionally useful electronic general-purpose computer. Influenced by the ABC, it was a turning point in the history of computing; it was used to perform ballistic trajectory calculations and consumed about 160 kW of power. World War II was a major driving force behind computing-hardware development, and one important wartime use of computers was the encryption and decryption of communications.
The UNIVAC I (Universal Automatic Computer), manufactured by Remington Rand in the USA, was the first commercially available, “mass-produced” electronic computer; the first unit was delivered to the US Census Bureau in 1951. It used 5,200 vacuum tubes and consumed 125 kW of power. Forty-six machines were sold, at more than $1 million each.
The microprocessor eventually led to the development of the microcomputer, small, low-cost computers that individuals and small businesses could afford.
By the 1990s, the microcomputer or Personal Computer (PC) became a common household appliance, and became even more widespread with the advent of the Internet.