HISTORY OF JERUSALEM

Jerusalem's history stretches back about 5,000 years. About 2500 BC, the Canaanites inhabited the city. Later, Jerusalem became a Jebusite citadel. When DAVID captured the city (c.1000 BC), the Jebusites were absorbed into the Jewish people. David made Jerusalem the capital of his kingdom, and SOLOMON built the first Temple to house the Ark of the Covenant. In 586 BC, the Babylonian NEBUCHADNEZZAR II destroyed Jerusalem and the Temple and exiled the Jews to Babylonia. Fifty years later (537 BC), CYRUS THE GREAT of Persia conquered Babylonia and permitted the Jews to return to Jerusalem and rebuild their Temple. Persia held the city until 333 BC, when ALEXANDER THE GREAT added Palestine to his empire. In 323 BC, PTOLEMY I of Egypt took Palestine into his kingdom.

About 198 BC, the Seleucid king ANTIOCHUS III conquered Judaea (of which Jerusalem was a part), making it tributary to Syria. The Jews later revolted under the leadership of the Maccabees and defeated the Syrians. The Temple was reconsecrated in 165 BC, and the Maccabean, or Hasmonean, dynasty ruled until Rome took the city in 63 BC. The Romans set up a local dynasty, the house of Herod, to rule most of Palestine; Herod the Great (r. 40-4 BC) rebuilt much of Jerusalem, including the Temple. Roman governors, however, retained ultimate control; one of them, Pontius Pilate, authorized the execution of Jesus Christ. While suppressing a major Jewish revolt, the Romans destroyed the Second Temple in AD 70. In 135, after the failure of the BAR KOCHBA revolt, Jews were banished from Jerusalem. From the early 4th century, when Christianity became legal in the Roman Empire, Jerusalem developed as a center of Christian pilgrimage. The Church of the Holy Sepulcher and many other Christian shrines were erected. Except for a brief period of Persian rule (614-28), the city remained under Roman (later, Byzantine) control until 638, when the Muslim Arabs took Jerusalem. The Arabs built (688-91) the Dome of the Rock shrine on the site of the Temple.
In the 11th century, Muslim toleration of both Jews and Christians gave way to persecution under the FATIMID caliph al-Hakim (r. 996-1021) and under the SELJUKS, who seized Jerusalem in 1071. European Christendom responded by launching the CRUSADES. The Crusaders conquered Jerusalem in 1099 and established a Crusader state. SALADIN recaptured the city for the Muslims in 1187, and the Ayyubid and Mameluke dynasties ruled until 1517, when the Ottoman Empire took control.
In 1917 the British occupied Jerusalem, and it became the capital of mandated PALESTINE from 1923 until 1948. During this period the city saw Arab rioting against the Jews. The 1947 United Nations partition plan for Palestine called for internationalization of the city. The Arabs rejected this resolution, and, from 1949, Jerusalem was divided into an Israeli and a Jordanian sector. The city remained divided until 1967, when Israel took the entire city following the Six Day War. The city is reunited today under the Israeli government, which guarantees religious freedom and protection of all holy places.

HISTORY OF DISNEYLAND

Disney and his brother Roy mortgaged everything they owned to raise $17 million to build Disneyland, but fell short of what they needed. ABC-TV stepped in, guaranteeing a $6 million loan in exchange for part ownership and Disney's commitment to produce a weekly television show for them.
When the City of Burbank denied a request to build near the studio, a crucial chapter in Disneyland history began. Disney engaged the Stanford Research Institute, which identified Anaheim as the center of Southern California's future growth. Disney bought 160 acres of Anaheim orange groves, and on May 1, 1954, construction began toward an impossible deadline of July 1955, when the money would run out.

Opening Day: the Blackest Sunday in Disneyland History

On Sunday, July 17, 1955, invited guests arrived, and 90 million people watched via a live television broadcast. The day is still known in Disney lore as "Black Sunday," and for good reason, as a guest list of 15,000 swelled to almost 30,000 attendees. Among the many mishaps:
• Local police dubbed the seven-mile freeway backup the worst mess they had ever seen.
• Rides and attractions broke down under the onslaught of guests, opening and closing periodically to make way for television crews.
• Fantasyland closed temporarily due to a gas leak.
• Main Street's freshly-poured asphalt softened in the heat. Women wearing high heels sometimes left a shoe behind, stuck in black goo.
• Because of a plumbers' strike, restrooms and drinking fountains could not both be ready by opening day. Walt opted for restrooms, leaving visitors hot and thirsty.
Most reviewers declared the park overpriced and poorly managed, expecting Disneyland history to be over almost as soon as it began.
Disneyland History: Open to the Public
The next day, 10,000 members of the general public got their first peek. On the first day of its long history, Disneyland charged visitors $1.00 admission (about $6.50 in today's dollars) to get through the gate and see three free attractions in four themed lands. Individual tickets for the 18 rides cost 10 to 35 cents each.
Walt and his staff addressed the problems, limiting daily attendance to 20,000 to avoid overcrowding. Within seven weeks, the one-millionth guest passed through the gates.
Landmark Dates in Disneyland History
"Disneyland will never be completed as long as there is imagination left in the world," Walt Disney once said. Within a year of the opening, attractions were opening, closing and changing, taking Disneyland through an evolution that still continues. A few of the more notable dates in Disneyland history include:
1959: Disneyland almost causes an international incident when U.S. officials deny Soviet Premier Nikita Khrushchev a visit because of security concerns.
1959: "E" ticket introduced. The most expensive ticket, it granted access to the most exciting rides and attractions such as Space Mountain and Pirates of the Caribbean.
1963: The Enchanted Tiki Room opens, and the term "animatronics" (robotics combined with 3-D animation) is coined.
1964: Disneyland generates more money than Disney Films.
1966: Walt Disney dies.
1982: The Disneyland Ticket Book is retired, replaced by a "Passport" good for unlimited rides.
1985: Year-round, daily operation begins. Before this, the park closed Monday and Tuesday during off seasons.
1999: FASTPASS introduced.
2001: Downtown Disney, Disney's California Adventure and the Grand Californian Hotel open.
2004: Australian Bill Trow is the 500-millionth guest.

HISTORY OF KA'BAH

The small, cubed building known as the Kaba [Ka'bah] may not rival skyscrapers in height or mansions in width, but its impact on history and human beings is unmatched. The Kaba is the building towards which Muslims face five times a day, every day, in prayer. This has been the case since the time of Prophet Muhammad (peace and blessings be upon him) over 1400 years ago.

The Size of the Kaba (Ka'bah)

The current height of the Kaba is 39 feet, 6 inches, and its total area comes to 627 square feet. The inside room of the Kaba measures 13 by 9 meters. The Kaba's walls are one meter thick. The floor inside is 2.2 meters higher than the area where people perform Tawaf. The ceiling and roof are two levels made of wood; they were reconstructed with teak capped with stainless steel. The walls are all made of stone. The stones inside are unpolished, while those outside are polished.

This small building has been constructed and reconstructed by the Prophets Adam, Ibrahim [Abraham], Ismail [Ishmael] and Muhammad (peace be upon them all). No other building has had this honor. Yet not very much is commonly known about the details of this small but significant building. Did you know that the Kaba was reconstructed as recently as about four years ago? Did you know that the Kaba has been endangered by natural disasters such as flooding, as well as by human attacks? If you didn't, keep reading: below you'll find some rarely heard information and discover facts about the Kaba that many are unaware of.

The Other Names of the Kaba

Literally, Kaba in Arabic means a high place with respect and prestige. The word Kaba may also be a derivative of a word meaning cube. Its other names include Bait ul Ateeq, which by one interpretation means the earliest and ancient, and by another means independent and liberating. Both meanings could be taken.

The History of the Kaba

Scholars and historians say that the Kaba has been reconstructed between five and 12 times. The very first construction of the Kaba was done by Prophet Adam. Allah [swt, glory be to Him] says in the Quran that this was the first house that was built for humanity to worship Allah. After this, Prophet Ibrahim and Ismail rebuilt the Kaba. The measurements of the Kaba's Ibrahimic foundation are as follows:
- the eastern wall was 48 feet and 6 inches
- the Hateem side wall was 33 feet
- the side between the black stone and the Yemeni corner was 30 feet
- the Western side was 46.5 feet
Following this, there were several constructions before Prophet Muhammad's time.
Prophet Muhammad participated in one of its reconstructions before he became a Prophet. After a flash flood, the Kaba was damaged and its walls cracked. It needed rebuilding. This responsibility was divided among the Quraish's four tribes. Prophet Muhammad helped with this reconstruction. Once the walls were erected, it was time to place the Black Stone (the Hajar ul Aswad) on the eastern wall of the Kaba.
Arguments erupted about who would have the honor of putting the Black Stone in its place. A fight was about to break out over the issue, when Abu Umayyah, Makkah's oldest man, proposed that the first man to enter the gate of the mosque the following morning would decide the matter. That man was the Prophet. The Makkans were ecstatic. "This is the trustworthy one (Al-Ameen)," they shouted in a chorus. "This is Muhammad". He came to them and they asked him to decide on the matter. He agreed.
Prophet Muhammad proposed a solution that all agreed to: the Black Stone was placed on a cloak, and the elders of each clan held one edge of the cloak and carried the stone to its place. The Prophet then picked up the stone and placed it on the wall of the Kaba.
Since the tribe of Quraish did not have sufficient funds, this reconstruction did not include the entire foundation of the Kaba as built by Prophet Ibrahim. This was the first time the Kaba acquired the cubical shape it has now, in place of the rectangular shape it had earlier. The portion of the Kaba left out is now called the Hateem.

What is inside the Kaba?

Dr. Muzammil Siddiqi is the president of the Islamic Society of North America (ISNA). He had the opportunity to go inside the Kaba in October 1998. In an interview with Sound Vision, he described the following features:
- there are two pillars inside (others report 3 pillars)
- there is a table on the side to put items like perfume
- there are two lantern-type lamps hanging from the ceiling
- the space can accommodate about 50 people
- there are no electric lights inside
- the walls and the floors are of marble
- there are no windows inside
- there is only one door
- the upper inside walls of the Kaba were covered with some kind of curtain with the Kalima written on it.
----------------------------------------------------------------------------------------------------------------------------------------

Allah: Allah is the proper name in Arabic for The One and Only God, The Creator and Sustainer of the universe. It is used by Arab Christians and Jews for God (Eloh-im in Hebrew; 'Allaha' in Aramaic, the mother tongue of Jesus, pbuh). The word Allah does not have a plural or gender. Allah does not have any associate or partner, and He does not beget nor was He begotten. SWT is an abbreviation of Arabic words that mean 'Glory Be To Him.'
s or pbuh: Peace Be Upon Him. This expression is used for all Prophets of Allah.
----------------------------------------------------------------------------------------------------------------------------------------
Courtesy: The Muslim Observer, Vol II, Issue Eleven, March 17-23, 2000.


HISTORY OF LIBERTY STATE PARK

On New York Harbor, less than 2,000 feet from the Statue of Liberty, Liberty State Park has played a vital role in the development of New Jersey's metropolitan region and the history of the nation.

During the 19th and early 20th centuries the area that is now Liberty State Park was a major waterfront industrial area with an extensive freight and passenger transportation network. This network became the lifeline of New York City and the harbor area. The heart of this transportation network was the Central Railroad of New Jersey Terminal (CRRNJ), located in the northern portion of the park. The CRRNJ Terminal stands with the Statue of Liberty and Ellis Island to unfold one of this nation's most dramatic stories: the immigration of northern, southern, and eastern Europeans into the United States.

After being greeted by the Statue of Liberty and processed at Ellis Island, these immigrants purchased tickets and boarded trains, at the CRRNJ Terminal, that took them to their new homes throughout the United States. The Terminal served these immigrants as the gateway to the realization of their hopes and dreams of a new life in America.

Today, Liberty State Park continues to serve a vital role in the New York Harbor area. As the railroads and industry declined, the land was abandoned and became a desolate dump site. With the development of Liberty State Park came a renaissance of the waterfront. Land with decaying buildings, overgrown tracks and piles of debris was transformed into a modern urban state park. The park was formally opened on Flag Day, June 14, 1976, as New Jersey's bicentennial gift to the nation. Most of this 1,122-acre park is open space, with approximately 300 acres developed for public recreation.

HISTORY OF EIFFEL TOWER

The Eiffel Tower was built for the International Exhibition of Paris of 1889, commemorating the centenary of the French Revolution. The Prince of Wales, later King Edward VII of England, opened the tower. Of the 700 proposals submitted in a design competition, Gustave Eiffel's was unanimously chosen. However, it was not accepted by all at first, and a petition of 300 names - including those of Maupassant, Emile Zola, Charles Garnier (architect of the Opéra Garnier), and Dumas the Younger - protested its construction.

At 300 meters (320.75 m including the antenna) and 7,000 tons, it was the world's tallest structure until 1930. Other statistics include:

* 2.5 million rivets
* 300 ironworkers, and 2 years (1887-1889) to construct it.
* Sway of at most 12 cm in high winds.
* Height varies up to 15 cm depending on temperature (a rough check follows below).
* 15,000 iron pieces (excluding rivets), 40 tons of paint, and 1,652 steps to the top.

In 1889, Gustave Eiffel began to fit the peak of the tower as an observation station to measure the speed of wind. He also encouraged several scientific experiments, including Foucault's giant pendulum, a mercury barometer, and the first experiments in radio transmission. In 1898, Eugene Ducretet, at the Pantheon, received signals from the tower.
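The temperature-driven height change in the list above is consistent with simple linear thermal expansion of iron. As a rough check (the expansion coefficient and the seasonal temperature swing are assumed typical values, not figures from this article):

    \Delta L = \alpha \, L \, \Delta T \approx (12 \times 10^{-6}\ \text{per}\ ^{\circ}\text{C}) \times (300\ \text{m}) \times (40\ ^{\circ}\text{C}) \approx 0.14\ \text{m}

or about 14 cm, in line with the quoted 15 cm.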

After his experiments in the field of meteorology, Gustave Eiffel began to look at the effects of wind and air resistance, the science that would later be termed aerodynamics, which has become a large part of both military and commercial aviation as well as rocket technology. Eiffel imagined an automatic device sliding along a cable stretched between the ground and the second floor of the tower.

The tower was almost torn down in 1909, when its permit expired, but the city let it stand because of its antenna, which was used for telegraphy at that time as well as for other military and civilian purposes. When the tower played an important role in capturing the infamous spy Mata Hari during World War I, it gained such importance to the French people that there was no more thought of demolishing it.

From 1910 on, the Eiffel Tower was part of the International Time Service. French radio (since 1918) and French television (since 1957) have also made use of its stature.

During its lifetime, the Eiffel Tower has also witnessed a few strange scenes, including being scaled by a mountaineer in 1954 and parachuted off in 1984 by two Englishmen. In 1923 a journalist rode a bicycle down from the first level. Some accounts say he rode down the stairs; others suggest he rode down the exterior of one of the tower's four legs, which slope outward.

Of the 7.5 million kilowatt hours of electricity used annually, 580 thousand are used exclusively to illuminate the tower. The tower's annual operation also requires the use of 2 tons of paper for tickets, 4 tons of rag or paper wipes, 10,000 applications of detergents, 400 liters of metal cleansers and 25,000 garbage bags.

On the four facades of the tower, the 72 surnames of leading turn-of-the-century French scientists and engineers are engraved in recognition of their contributions to science. The engraving was painted over at the beginning of the 20th century and restored in 1986-1987 by the Société Nouvelle d'Exploitation de la Tour Eiffel, a company contracted to operate business related to the Tower.

HISTORY OF PISA TOWER

Although intended to stand vertically, the tower began leaning to the southeast soon after the onset of construction in 1173, due to a poorly laid foundation and loose substrate that allowed the foundation to shift. The tower presently leans to the southwest.

The height of the tower is 55.86 m (183.27 ft) from the ground on the lowest side and 56.70 m (186.02 ft) on the highest side. The width of the walls at the base is 4.09 m (13.42 ft) and at the top 2.48 m (8.14 ft). Its weight is estimated at 14,500 metric tons (16,000 short tons). The tower has 296 or 294 steps; the seventh floor has two fewer steps on the north-facing staircase. The tower leans at an angle of 3.97 degrees. This means that the top of the tower is 3.9 metres (12 ft 10 in) from where it would stand if the tower were perfectly vertical.
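Those figures are mutually consistent: the horizontal offset of the top follows directly from the height and the tilt angle, as a quick check shows.

    \text{offset} \approx H \sin\theta = 55.86\ \text{m} \times \sin(3.97^{\circ}) \approx 3.87\ \text{m}

which rounds to the 3.9 metres quoted above.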

Construction

The Tower of Pisa was a work of art, built in three stages over a period of about 177 years. Construction of the first floor of the white marble campanile began on August 9, 1173, during a period of military success and prosperity. This first floor is surrounded by pillars with classical capitals, leaning against blind arches.

The tower began to sink after construction progressed to the third floor in 1178. This was due to a mere three-meter foundation, set in weak, unstable subsoil. This means the design was flawed from the beginning. Construction was subsequently halted for almost a century, because the Pisans were almost continually engaged in battles with Genoa, Lucca and Florence. This allowed time for the underlying soil to settle. Otherwise, the tower would almost certainly have toppled. In 1198, clocks were temporarily installed on the third floor of the unfinished construction.

In 1272, construction resumed under Giovanni di Simone, architect of the Camposanto. In an effort to compensate for the tilt, the engineers built higher floors with one side taller than the other. This made the tower begin to lean in the other direction. Because of this, the tower is actually curved. Construction was halted again in 1284, when the Pisans were defeated by the Genoans in the Battle of Meloria.

The seventh floor was completed in 1319. The bell-chamber was not finally added until 1372. It was built by Tommaso di Andrea Pisano, who succeeded in harmonizing the Gothic elements of the bell-chamber with the Romanesque style of the tower. There are seven bells, one for each note of the musical scale. The largest one was installed in 1655.

After a phase (1990-2001) of structural strengthening, the tower has been undergoing gradual surface restoration to repair visual damage, mostly corrosion and blackening, which is particularly severe owing to the tower's age and its exposure to wind and rain.

Timeline

* On January 5, 1172, Donna Berta di Bernardo, a widow and resident of the house of dell'Opera di Santa Maria, bequeathed sessanta soldi or "sixty coins" to the Opera Campanilis petrarum Sancte Marie. This money was to be used toward the purchase of a few stones which still form the base of the bell tower today. On August 9, 1173, the foundations of the Tower were laid.
* Nearly four centuries later Giorgio Vasari wrote: "Guglielmo, according to what is being said, in [this] year 1174 with Bonanno as sculptor, laid the foundations of the belltower of the cathedral in Pisa."
* Another possible builder is Gerardo di Gerardo. His name appears as a witness to the above legacy of Berta di Bernardo as "Master Gerardo", and as a worker whose name was Gerardo.
* A more probable builder is Diotisalvi, because of the construction period and the structure's affinities with other buildings in Pisa. But he usually signed his works, and there is no signature by him in the belltower.
* Giovanni di Simone was heavily involved in the work of completing the tower, under the direction of Giovanni Pisano, who at the time was master builder of the Opera di Santa Maria Maggiore. He could be the same Giovanni Pisano who completed the belfry tower.
* Giorgio Vasari indicates that Tommaso di Andrea Pisano was the designer of the belfry between 1360 and 1370.
* On December 27, 1233 the worker Benenato, son of Gerardo Bottici, oversaw the continuation of the construction of the belltower.
* On February 23, 1260 Guido Speziale, son of Giovanni, a worker on the cathedral Santa Maria Maggiore, was elected to oversee the building of the Tower.
* On April 12, 1264 the master builder Giovanni di Simone and 23 workers went to the mountains close to Pisa to cut marble. The cut stones were given to Rainaldo Speziale, worker of St. Francesco.

The Architect

There has been controversy about the real identity of the architect of the Leaning Tower of Pisa. For many years, the design was attributed to Guglielmo and to Bonanno Pisano, a well-known 12th-century resident artist of Pisa, famous for his bronze casting, particularly in the Pisa Duomo. Bonanno Pisano left Pisa in 1185 for Monreale, Sicily, only to come back and die in his home town. His sarcophagus was discovered at the foot of the tower in 1820. However, recent studies seem to indicate Diotisalvi as the original architect, due to the time of construction and the affinity with other Diotisalvi works, notably the bell tower of San Nicola (Pisa) and the Baptistery in Pisa. However, he usually signed his works, and there is no signature by him in the bell tower, which leads to further speculation.

History Following Construction

Galileo Galilei is said to have dropped two cannon balls of different masses from the tower to demonstrate that their speed of descent was independent of their mass. This is considered an apocryphal tale, and the only source for it comes from Galileo's secretary.

During World War II, the Allies discovered that the Nazis were using it as an observation post. A U.S. Army sergeant was briefly entrusted with the fate of the tower and his decision not to call in an artillery strike saved the tower from destruction.

On February 27, 1964, the government of Italy requested aid in preventing the tower from toppling. It was, however, considered important to retain the current tilt, due to the vital role that this element played in promoting the tourism industry of Pisa. A multinational task force of engineers, mathematicians and historians was assigned and met on the Azores islands to discuss stabilization methods. It was found that the tilt was increasing in combination with the softer foundations on the lower side. Many methods were proposed to stabilize the tower, including the addition of 800 metric tons of lead counterweights to the raised end of the base.

In 1987, the tower was declared as part of the Piazza dei Miracoli UNESCO World Heritage Site along with the neighbouring cathedral, baptistery and cemetery.

On January 7, 1990, after over two decades of work on the subject, the tower was closed to the public. While the tower was closed, the bells were removed to relieve some weight, and cables were cinched around the third level and anchored several hundred meters away. Apartments and houses in the path of the tower were vacated for safety. The final solution to prevent the collapse of the tower was to slightly straighten the tower to a safer angle, by removing 38 cubic metres (50 cu yd) of soil from underneath the raised end. The tower was straightened by 18 inches (45 centimetres), returning to the exact position that it occupied in 1838. After a decade of corrective reconstruction and stabilization efforts, the tower was reopened to the public on December 15, 2001, and has been declared stable for at least another 300 years.

In May 2008, after the removal of another 70 metric tons (77 short tons) of earth, engineers announced that the Tower had been stabilized such that it had stopped moving for the first time in its history. They stated it would be stable for at least 200 years.

HISTORY OF THE CHINESE GREAT WALL

The history of the Great Wall is said to start in the Spring and Autumn Period, when seven powerful states appeared at the same time. In order to defend themselves, they all built walls and stationed troops on their borders. At that time, the total length of these walls, belonging to different states, had already reached 3,107 miles.

In 221 BC, Emperor Qin Shi Huang absorbed the other six states and set up the first unified kingdom in Chinese history. In order to strengthen his newborn authority and defend against the Huns in the north, he ordered the connecting of the walls once built by the other states, as well as the addition of some sections of his own. Thus was formed the long Qin wall, which started from the east of today's Liaoning Province and ended at Lintao, Gansu Province.

In the Western Han Dynasty, the Huns became more powerful. The Han court started to build more walls on a larger scale in order to consolidate the frontier. In the west, the wall along the Hexi Corridor, Yumenguan Pass, and Yangguan Pass was built. In the north, Yanmenguan Pass and Niangziguan Pass in Shanxi were set up. Many more sections of the wall extended to Yinshan Mountain, and half of the ancient Silk Road ran along the Han wall.

The Northern Wei, Northern Qi and Northern Zhou Dynasties all built their own sections but on a smaller scale than the walls in the Han Dynasty. The powerful Tang Dynasty saw peace between the northern tribes and central China most of the time so few Great Wall sections were built in this period.

The Ming Dynasty was the peak of wall building in Chinese history. The Ming suffered greatly from disturbances by minority tribes such as the Dadan, Tufan and Nuzhen, and the Ming court, from its first emperor to the last, ceaselessly built walls in the north. The main line started from Jiuliancheng near the Yalu River in the east and ran to the Jiayuguan Pass in the west, measuring over 4,600 miles. Besides adding many more miles of their own, the Ming emperors ordered the enlargement of the walls of previous dynasties into double-line or multi-line walls. For example, beyond Yanmenguan Pass, three big stone walls and 23 small stone walls were added. Eleven garrisons were distributed along the main line of the wall. The countless walls, fortresses, and watchtowers made the country strongly fortified. In the early Qing Dynasty, some sections of the walls were repaired and several sections were extended. This great engineering work stopped in the middle of the Qing Dynasty.

Owing to its long history, natural disasters, and human activities, many sections of the Great Wall are severely damaged and disappearing. As a world-famous engineering project and a witness to the rise and fall of Chinese dynasties, the Great Wall needs immediate action to protect it.

(The wild Great Wall, with trees growing on it, was once not wild at all: over one million soldiers guarded the wall, and an estimated 2 to 3 million people died building it between 1368 and 1644.)


HISTORY OF TELEPHONE

Credit for inventing the electric telephone remains in dispute. As with other great inventions such as radio, television, the light bulb, and the computer, several inventors did pioneering experimental work on voice transmission over a wire and improved on each other's ideas. Innocenzo Manzetti, Antonio Meucci, Johann Philipp Reis, Elisha Gray, Alexander Graham Bell, and Thomas Edison, among others, have all been credited with pioneering work on the telephone.

The early history of the telephone is a confusing morass of claim and counterclaim, which was not clarified by the huge mass of lawsuits that sought to resolve the patent claims of individuals. The Bell and Edison patents, however, were forensically victorious and commercially decisive.

Early development
[Image caption: Early telephone with hand-cranked generator.]
• 1844 — Innocenzo Manzetti first mooted the idea of a “speaking telegraph” (telephone).
• 26 August 1854 — Charles Bourseul publishes an article in the magazine L'Illustration (Paris): "Transmission électrique de la parole" ("Electric transmission of speech").
• 26 October 1861 — Johann Philipp Reis (1834–1874) publicly demonstrated the Reis telephone before the Physical Society of Frankfurt.
• 22 August 1865 — La Feuille d'Aoste reported: “It is rumored that English technicians to whom Mr. Manzetti illustrated his method for transmitting spoken words on the telegraph wire intend to apply said invention in England on several private telegraph lines.”
• 28 December 1871 — Antonio Meucci files a patent caveat (n.3335) in the U.S. Patent Office titled "Sound Telegraph", describing communication of voice between two people by wire.
• 1874 — Meucci, after having renewed the caveat for two years, fails to find the money to renew it. The caveat lapses.
• 6 April 1875 — Bell's U.S. Patent 161,739 "Transmitters and Receivers for Electric Telegraphs" is granted. This uses multiple vibrating steel reeds in make-break circuits.
• 11 February 1876 — Gray invents a liquid transmitter for use with a telephone but does not build one.
• 14 February 1876 — Elisha Gray files a patent caveat for transmitting the human voice through a telegraphic circuit.
• 14 February 1876 — Alexander Bell applies for the patent "Improvements in Telegraphy", for electromagnetic telephones using undulating currents.
• 19 February 1876 — Gray is notified by the U.S. Patent Office of an interference between his caveat and Bell's patent application. Gray decides to abandon his caveat.
• 7 March 1876 — Bell's U.S. patent 174,465 "Improvement in Telegraphy" is granted, covering "the method of, and apparatus for, transmitting vocal or other sounds telegraphically … by causing electrical undulations, similar in form to the vibrations of the air accompanying the said vocal or other sound."
• 10 March 1876 — The first successful telephone transmission of clear speech using a liquid transmitter took place when Bell spoke into his device, “Mr. Watson, come here, I want to see you,” and Watson heard each word distinctly.
• 30 January 1877 — Bell's U.S. patent 186,787 is granted for an electromagnetic telephone using permanent magnets, iron diaphragms, and a call bell.
• 27 April 1877 — Edison files for a patent on a carbon (graphite) transmitter. Patent 474,230 was granted 3 May 1892, after a 15-year delay because of litigation. Edison was granted patent 222,390 for a carbon-granules transmitter in 1879.

Early commercial instruments
[Image caption: Modern emergency telephone powered by sound alone.]
Early telephones were technically diverse. Some used a liquid transmitter, some had a metal diaphragm that induced current in an electromagnet wound around a permanent magnet, and some were "dynamic" - their diaphragm vibrated a coil of wire in the field of a permanent magnet, or the coil vibrated the diaphragm. This dynamic kind survived in small numbers through the 20th century in military and maritime applications, where its ability to create its own electrical power was crucial. Most, however, used the Edison/Berliner carbon transmitter, which was much louder than the other kinds, even though it required an induction coil, actually an impedance-matching transformer, to make it compatible with the impedance of the line. The Edison patents kept the Bell monopoly viable into the 20th century, by which time the network was more important than the instrument.
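For readers unfamiliar with impedance matching: an ideal transformer with primary-to-secondary turns ratio N_p : N_s makes an impedance Z_s on the secondary appear at the primary as

    Z_p = \left(\frac{N_p}{N_s}\right)^2 Z_s

so matching a low-resistance carbon transmitter (say 50 ohms, an assumed round number; the article gives no values) to a line on the order of 600 ohms would call for a turns ratio near \sqrt{600/50} \approx 3.5.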

Early telephones were locally powered, using either a dynamic transmitter or a transmitter powered by a local battery. One of the jobs of outside plant personnel was to visit each telephone periodically to inspect the battery. During the 20th century, "common battery" operation came to dominate, powered by "talk battery" from the telephone exchange over the same wires that carried the voice signals.
Early telephones used a single wire for the subscriber's line, with ground return used to complete the circuit (as used in telegraphs). The earliest dynamic telephones also had only one opening for sound, and the user alternately listened and spoke (or rather, shouted) into the same hole. Sometimes the instruments were operated in pairs at each end, making conversation more convenient, but they were also more expensive.

At first, the benefits of an exchange were not exploited. Telephones instead were leased in pairs to a subscriber, who had to arrange for telegraph contractors to construct a line between them, for example between his home and his shop. Users who wanted the ability to speak to several different locations would need to obtain and set up three or four pairs of telephones. Western Union, already using telegraph exchanges, quickly extended the principle to its telephones in New York City and San Francisco, and Bell was not slow in appreciating the potential.
Signalling began in an appropriately primitive manner. The user alerted the other end, or the exchange operator, by whistling into the transmitter.
Exchange operation soon resulted in telephones being equipped with a bell, first operated over a second wire, and later over the same wire, but with a condenser (capacitor) in series with the bell coil to allow the AC ringer signal through while still blocking DC (keeping the phone "on hook"). Telephones connected to the earliest Strowger automatic exchanges had seven wires, one for the knife switch, one for each telegraph key, one for the bell, one for the push button and two for speaking.
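The condenser works because a capacitor's opposition to alternating current falls as frequency rises, following

    X_C = \frac{1}{2\pi f C}

At DC (f = 0) the reactance is effectively infinite, so no steady line current flows and the exchange still sees the phone as on hook; for a 20 Hz ringing signal and a ringer capacitor of, say, 0.5 µF (both values assumed typical rather than taken from this article), X_C comes to roughly 16 kΩ, low enough for the high-voltage ringing supply to operate the bell.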
Rural and other telephones that were not on a common battery exchange had a magneto or hand-cranked generator to produce a high voltage alternating signal to ring the bells of other telephones on the line and to alert the operator.

In the 1890s a new smaller style of telephone was introduced, packaged in three parts. The transmitter stood on a stand, known as a "candlestick" for its shape. When not in use, the receiver hung on a hook with a switch in it, known as a "switchhook." Previous telephones required the user to operate a separate switch to connect either the voice or the bell. With the new kind, the user was less likely to leave the phone "off the hook". In phones connected to magneto exchanges, the bell, induction coil, battery and magneto were in a separate "bell box." In phones connected to common battery exchanges, the bell box was installed under a desk, or other out of the way place, since it did not need a battery or magneto.
Cradle designs were also used at this time, having a handle with the receiver and transmitter attached, separate from the cradle base that housed the magneto crank and other parts. They were larger than the "candlestick" and more popular.

Disadvantages of single wire operation such as crosstalk and hum from nearby AC power wires had already led to the use of twisted pairs and, for long distance telephones, four-wire circuits. Users at the beginning of the 20th century did not place long distance calls from their own telephones but made an appointment to use a special sound proofed long distance telephone booth furnished with the latest technology.

What turned out to be the most popular and longest lasting physical style of telephone was introduced in the early 20th century, including Bell's Model 102. A carbon granule transmitter and electromagnetic receiver were united in a single molded plastic handle, which when not in use sat in a cradle in the base unit. The circuit diagram of the Model 102 shows the direct connection of the receiver to the line, while the transmitter was induction coupled, with energy supplied by a local battery. The coupling transformer, battery, and ringer were in a separate enclosure. The dial switch in the base interrupted the line current by repeatedly but very briefly disconnecting the line 1-10 times for each digit, and the hook switch (in the center of the circuit diagram) permanently disconnected the line and the transmitter battery while the handset was on the cradle.
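A minimal sketch of that dial-pulse scheme, in Python, assuming the common 10 pulses-per-second rate with a 60 ms line break per pulse (typical figures for later dial telephones, not values given in this article):

    # Hypothetical model of rotary-dial pulsing: digits 1-9 break the
    # line that many times; 0 sends ten breaks.
    def pulses_for_digit(digit: int) -> int:
        return 10 if digit == 0 else digit

    def dial_plan(number: str, break_ms: int = 60):
        """Return (digit, pulse count, seconds of line interruption) per digit."""
        plan = []
        for ch in number:
            if ch.isdigit():
                n = pulses_for_digit(int(ch))
                plan.append((ch, n, n * break_ms / 1000))
        return plan

    print(dial_plan("1903"))
    # -> [('1', 1, 0.06), ('9', 9, 0.54), ('0', 10, 0.6), ('3', 3, 0.18)]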

After the 1930s, the base also enclosed the bell and induction coil, obviating the old separate bell box. Power was supplied to each subscriber line by central office batteries instead of a local battery, which required periodic service. For the next half century, the network behind the telephone became progressively larger and much more efficient, but after the dial was added the instrument itself changed little until touch tone replaced the dial in the 1960s.

Digital telephony
The Public Switched Telephone Network (PSTN) has gradually evolved towards digital telephony, which has improved the capacity and quality of the network. End-to-end analog telephone networks were first modified in the early 1960s by upgrading transmission networks with T1 carrier systems. Later technologies such as SONET and fiber-optic transmission methods further advanced digital transmission. Although analog carrier systems existed, digital transmission made it possible to significantly increase the number of channels multiplexed on a single transmission medium. While today the end instrument remains analog, the analog signals reaching the aggregation point (the Serving Area Interface (SAI) or the central office (CO)) are typically converted to digital signals. Digital loop carriers (DLC) are often used, placing the digital network ever closer to the customer premises and relegating the analog local loop to legacy status.
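As a back-of-the-envelope illustration of the capacity gain, the standard T1 frame format (24 channels of 8-bit samples at 8,000 frames per second, plus one framing bit per frame; these are standard T1 figures rather than numbers quoted in this article) yields the familiar 1.544 Mbit/s line rate:

    # Standard T1 framing arithmetic (figures assumed; see lead-in).
    CHANNELS = 24
    FRAMES_PER_SECOND = 8000     # one 8-bit sample per channel per frame
    BITS_PER_SAMPLE = 8

    channel_rate = FRAMES_PER_SECOND * BITS_PER_SAMPLE   # 64,000 bit/s per voice channel
    frame_bits = CHANNELS * BITS_PER_SAMPLE + 1          # 193 bits, incl. framing bit
    line_rate = frame_bits * FRAMES_PER_SECOND           # 1,544,000 bit/s

    print(channel_rate, line_rate)                       # 64000 1544000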

IP telephony
Internet Protocol (IP) telephony (also known as Voice over IP) is a disruptive technology that is rapidly gaining ground against traditional telephone network technologies. As of January 2005, up to 10% of subscribers in Japan and South Korea had switched to this digital telephone service. A January 2005 Newsweek article suggested that Internet telephony may be "the next big thing." As of 2006 many VoIP companies offer service to consumers and businesses.
IP telephony uses an Internet connection and hardware IP phones or a softphone installed on a personal computer to transmit conversations as data packets. In addition to replacing POTS (plain old telephone service), IP telephony is also competing with mobile phone networks by offering free or lower-cost connections via WiFi hotspots. VoIP is also used on private wireless networks, which may or may not have a connection to the outside telephone network.
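Packetizing voice carries measurable overhead. A rough per-direction bandwidth estimate for a single call, assuming G.711 audio in 20 ms RTP packets over UDP/IPv4 (common defaults, not figures from this article):

    # Hypothetical per-call VoIP bandwidth estimate (assumptions above).
    PAYLOAD_BYTES = 160       # 20 ms of G.711 audio at 8,000 bytes/s
    HEADER_BYTES = 40         # IPv4 (20) + UDP (8) + RTP (12)
    PACKETS_PER_SECOND = 50   # one packet every 20 ms

    bps = (PAYLOAD_BYTES + HEADER_BYTES) * 8 * PACKETS_PER_SECOND
    print(bps)                # 80000 -> ~80 kbit/s, vs. 64 kbit/s of raw audio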

Usage
By the end of 2006, there were a total of nearly 4 billion mobile and fixed line subscribers and over 1 billion Internet users worldwide. This included 1.27 billion fixed line subscribers and 2.68 billion mobile subscribers.

HISTORY OF AIRPLANE

To say simply that the Wright brothers invented the airplane would be disrespectful to the long years of scientific research and hard work put in by Orville and Wilbur Wright. Their story reads like the proverbial American dream, in which two honest, hardworking men, armed with nothing but their intelligence and determination, made one of the most significant inventions of the twentieth century.

Wilbur and Orville were born to Milton and Susan Wright. It was their father who initiated and encouraged the brothers' interest in flying machines. In 1878 Milton Wright returned from a work-related trip with a rubber-band-powered toy helicopter. Even at a young age, the Wright brothers immediately studied the model helicopter and started building replicas.

Around 1896, when the Wright brothers were successfully managing their bicycle company, the newspapers started carrying many stories about gliders and the inventors who were trying to fly. This triggered the imagination of both brothers. They noticed that all the aircraft developed until then lacked controls.

To start their venture, Wilbur wrote a letter to the Smithsonian Institution requesting all the information on flight experiments that it had. Subsequently, in 1899 the brothers developed a simple system to warp the wings of a biplane. Warping meant that the plane could be controlled and rolled left or right as required. They tested this system on a series of gliders they developed.
The Wright brothers used Kitty Hawk, North Carolina to test the various models they built. They launched two gliders in 1900 and 1901 but were disappointed with the performance due to lack of lift and control. The brothers went back to the drawing board and spent the winter of 1901-1902 designing a wind tunnel and conducting experiments to figure out the best wing shape. This allowed them to build a glider with plenty of lift. Towards the end of 1902 they launched their third glider with roll, pitch and yaw controls.
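The wind-tunnel winter mattered because it let the brothers replace guessed coefficients with measured ones. Lift in their era was estimated with an equation of the form

    L = k \, S \, V^2 \, C_L

where k is Smeaton's coefficient of air pressure, S the wing area, V the airspeed, and C_L the lift coefficient of the wing shape. By the usual account (the exact numbers should be read as historical approximations), the long-accepted value k ≈ 0.005 overpredicted lift, which helps explain the disappointing 1900 and 1901 gliders; the Wrights' measurements pointed to a value near 0.0033, and with corrected coefficients their 1902 glider performed as predicted.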

The next winter was spent in designing a gasoline engine small and powerful enough to propel an aircraft. Their mechanic Charlie Taylor was a great help in designing the engine. They also designed the first ever airplane propellers and finally built a new, powered aircraft.
However, the road to success was not so easy. They suddenly found themselves competing with Samuel Langley, Secretary of the Smithsonian Institution. He had also built a powered aircraft and had investment funding to help his ventures. Luckily for the Wright brothers, Langley’s two attempts at launching his airplane failed miserably and put him out of competition.

Other problems were not quite so easily resolved. The weather misbehaved, and there was not much they could do about it. Something within their control, however, was the propeller. The propeller shafts broke on the first attempt, and the drive sprockets were too loose on the second try. On the third try one of the propeller shafts cracked. Orville finally resolved the problem by using spring steel to make a new set of shafts. The aircraft was ready, and they called it the Flyer.
After two unsuccessful attempts, the Wright brothers made aeronautical history on December 17th, 1903. Orville Wright took the Flyer on a 12-second sustained flight covering 120 feet. In the next few hours the brothers made four flights, the longest of which covered 852 feet.
Thus, the Wright brothers invented the airplane and much more!

HISTORY OF FLIGHT

Around 400 BC - Flight in China

The Chinese discovery that a kite could fly in the air started humans thinking about flying. Kites were used by the Chinese in religious ceremonies. They also built many colorful kites for fun. More sophisticated kites were used to test weather conditions. Kites have been important to the invention of flight as they were the forerunners of balloons and gliders.

Humans Try to Fly like Birds

For many centuries, humans have tried to fly just like the birds and have studied the flight of birds. Wings made of feathers or lightweight wood were attached to arms to test their ability to fly. The results were often disastrous, as the muscles of the human arms are not like a bird's and cannot move with the strength of a bird.

Hero and the Aeolipile
The ancient Greek engineer Hero of Alexandria worked with air pressure and steam to create sources of power. One experiment that he developed was the aeolipile, which used jets of steam to create rotary motion. Hero mounted a sphere on top of a water kettle. A fire below the kettle turned the water into steam, and the gas traveled through pipes to the sphere. Two L-shaped tubes on opposite sides of the sphere allowed the gas to escape, which gave a thrust to the sphere that caused it to rotate. The importance of the aeolipile is that it marks the start of engine invention - engine-created movement would later prove essential in the history of flight.

1485 Leonardo da Vinci - The Ornithopter and the Study of Flight
Leonardo da Vinci made the first real studies of flight in the 1480s. He had over 100 drawings that illustrated his theories on bird and mechanical flight. The drawings illustrated the wings and tails of birds, ideas for man-carrying machines, and devices for testing wings. The Ornithopter flying machine was never actually created; it was a design that Leonardo da Vinci drew to show how man could fly. The modern-day helicopter is based on this concept. Leonardo da Vinci's notebooks on flight were reexamined in the 19th century by aviation pioneers.

1783 - Joseph and Jacques Montgolfier - The Flight of the First Hot Air Balloon
The brothers Joseph Michel and Jacques Etienne Montgolfier were inventors of the first hot air balloon. They used the smoke from a fire to blow hot air into a silk bag. The silk bag was attached to a basket. The hot air then rose and allowed the balloon to be lighter than air. In 1783, the first passengers in the colorful balloon were a sheep, a rooster and a duck. It climbed to a height of about 6,000 feet and traveled more than one mile. After this first success, the brothers began to send men up in hot air balloons. The first manned flight was on November 21, 1783; the passengers were Jean-Francois Pilatre de Rozier and Francois Laurent.

1799-1850s - George Cayley - Gliders
Sir George Cayley is considered the father of aerodynamics. Cayley experimented with wing design, distinguished between lift and drag, and formulated the concepts of vertical tail surfaces, steering rudders, rear elevators, and air screws. George Cayley worked to discover a way that man could fly. Cayley designed many different versions of gliders that used the movements of the body for control. A young boy, whose name is not known, was the first to fly one of Cayley's gliders, the first glider capable of carrying a human.
For over 50 years, George Cayley made improvements to his gliders. Cayley changed the shape of the wings so that the air would flow over the wings correctly. Cayley designed a tail for the gliders to help with stability. He tried a biplane design to add strength to the glider. George Cayley also recognized that there would be a need for machine power if the flight was to stay in the air for a long time. George Cayley wrote "On Aerial Navigation", which argued that a fixed-wing aircraft with a power system for propulsion and a tail to assist in control would be the best way to allow man to fly.

1891 Otto Lilienthal
German engineer Otto Lilienthal studied aerodynamics and worked to design a glider that would fly. Otto Lilienthal was the first person to design a glider that could carry a person and fly long distances.
Otto Lilienthal was fascinated by the idea of flight. Based on his studies of birds and how they fly, he wrote a book on aerodynamics that was published in 1889, and this text was used by the Wright Brothers as the basis for their designs.
After more than 2500 flights, Otto Lilienthal was killed when he lost control because of a sudden strong wind and crashed into the ground.

1891 Samuel Langley
Samuel Langley was a physicist and astronomer who realized that power was needed to help man fly. Langley conducted experiments using whirling arms and steam motors. He built a model of a plane, which he called an aerodrome, that included a steam-powered engine. In 1891, his model flew for three-quarters of a mile before running out of fuel.
Samuel Langley received a $50,000 grant to build a full-sized aerodrome. It was too heavy to fly, and it crashed. Deeply disappointed, he gave up trying to fly. His major contributions to flight involved attempts at adding a power plant to a glider. He was also well known as the secretary of the Smithsonian Institution in Washington, DC.

1894 Octave Chanute
Octave Chanute was a successful engineer who undertook the invention of airplanes as a hobby after being inspired by Otto Lilienthal. Chanute designed several aircraft; the Herring-Chanute biplane was his most successful design and formed the basis of the Wright biplane design. Octave Chanute published "Progress in Flying Machines" in 1894. It gathered and analyzed all the technical knowledge he could find about aviation accomplishments. It included all of the world's aviation pioneers. The Wright Brothers used this book as a basis for much of their experiments. Chanute was also in contact with the Wright Brothers and often commented on their technical progress.

1903 The Wright Brothers - First Flight
Orville Wright and Wilbur Wright were very deliberate in their quest for flight. First, they spent many years learning about all the early developments of flight. They completed detailed research of what other early inventors had done. They read all the literature that was published up to that time. Then, they began to test the early theories with balloons and kites. They learned about how the wind would help with the flight and how it could affect the surfaces once up in the air. The next step was to test the shapes of gliders much like George Cayley did when he was testing the many different shapes that would fly. They spent much time testing and learning about how gliders could be controlled.

The Wright Brothers designed and used a wind tunnel to test the shapes of the wings and the tails of the gliders. After they found a glider shape that would consistently fly in tests on the North Carolina Outer Banks dunes, they turned their attention to creating a propulsion system that would provide the thrust needed to fly. The early engine that they used generated almost 12 horsepower. The "Flyer" lifted from level ground to the north of Big Kill Devil Hill at 10:35 a.m. on December 17, 1903. Orville piloted the plane, which weighed six hundred and five pounds.

The first heavier-than-air flight traveled one hundred twenty feet in twelve seconds. The two brothers took turns during the test flights. It was Orville's turn to test the plane, so he is the brother credited with the first flight. Humankind was now able to fly! During the next century, many new airplanes and engines were developed to help transport people, luggage, cargo, military personnel and weapons. The 20th century's advances were all based on this first flight at Kitty Hawk by the American brothers from Ohio.

HISTORY OF AIRPORT IN UNITED STATES

In the earliest years of civil aviation, no federal money went to build or operate civil landing fields. Federal money was, however, spent to map and catalog the 980 airfields in the United States that had been built by 1918 with private funds. The government's main financial support for aviation came through the purchase of military aircraft and through the military airfields that the government had constructed, especially during World War I. The government also began airmail service in 1918.

As airmail grew, the U.S. government became more involved with airports. The Post Office began investing in air stations to support the transcontinental air route in the early 1920s. The Air Commerce Act, signed by President Calvin Coolidge on May 20, 1926, made it the duty of the Secretary of Commerce to “promote air commerce,” with provisions for: the licensing, inspection, and operation of aircraft; the licensing of pilots and of mechanics engaged in aircraft work; and the operation and extension of the airways system begun by postal authorities. The Act, however, specifically barred the use of federal money for building or maintaining airports. Despite this limitation, the growth of aviation encouraged by the Act led to more private airport development.

During the Great Depression, the Federal Government began massive funding for civil works as part of its effort to create jobs and stimulate the economy. Many of these projects involved airport construction. The Civil Works Administration and later the Federal Emergency Relief Administration spent $11.5 million by the spring of 1934 on labor for 943 airport projects in small cities that established 585 new airports. Aviation regulatory agencies cooperated with these programs. Use of federal funds for constructing landing areas “reasonably necessary for use in air commerce or in the interests of national defense” continued to be allowed.

In September 1939, war broke out in Europe, prompting Congress to appropriate $40 million for Development of Landing Areas for National Defense (DLAND). Under DLAND, the Secretaries of War, Commerce, and the Navy approved expenditures for airports. By 1941, the Army Air Corps had begun directing aid to 986 airports. The Civil Aeronautics Administration (CAA) spent $363 million to construct and repair airfields in the United States, with many designed for civil aviation use after the war. Following World War II, 500 of these airports were declared surplus and turned over to cities, counties, and state sponsors to manage.

For defense purposes, CAA in 1941 extended its air traffic control system to include operation of airport towers. This function became a permanent federal responsibility in the postwar era.
In 1944, CAA submitted a National Airport Plan that helped spark Congressional interest in meeting postwar airport needs. After debating the issue, Congress passed the Federal Airport Act, signed on May 13, 1946, by President Harry S Truman. The Act provided for $500 million in grants for airport projects paid over seven years. The maximum federal grant for an eligible project would provide half of the project's costs. Local airport sponsors would issue bonds to finance the rest of the cost. All projects had to meet CAA standards for location, layout, grading, drainage, paving, and lighting. Further, all tax money collected by local governments for aviation facilities or fuel had to go for airport operations and maintenance.

In 1950, the Federal Airport Act was extended to 1958. Only runways and taxiways were eligible for federal money. Local sponsors were responsible for terminal buildings and equipment. On August 3, 1955, President Dwight Eisenhower signed Public Law 84-211, which included a new four-year program that committed $63 million of federal money each year. At the end of this period, another bill continued the money for two more years. Additional amounts were appropriated annually until 1970 when the Federal Airport Act was repealed, and the Airport and Airway Development Act of 1970, signed by President Richard Nixon on May 21, 1970, became law.

Title I of the Act provided for, among other things, $250 million annually for the “acquisition, establishment, and improvement of air navigational facilities” and security equipment for the next ten years. Title II created what was popularly called the “aviation trust fund,” financed by an eight percent tax on domestic passenger fares, a three-dollar surcharge on passenger tickets originating in the United States, a tax of seven cents per gallon on gasoline and jet fuel, a five percent tax on airfreight waybills, and an annual registration fee and charge per pound for aircraft.
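Applied to a hypothetical $200 domestic fare (the fare is invented for illustration; the rates are the ones quoted above), the passenger-side levies work out as follows:

    # Illustrative Title II passenger levies on a made-up $200 fare.
    fare = 200.00
    ticket_tax = fare * 0.08      # eight percent tax on domestic passenger fares
    surcharge = 3.00              # per-ticket surcharge on U.S. originations
    print(ticket_tax)             # 16.0 -> $16.00 fare tax
    print(ticket_tax + surcharge) # 19.0 -> $19.00 if the surcharge also applies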

On July 12, 1976, President Gerald Ford signed amendments to the law that increased the taxes levied. Now, eligible projects included snow removal, equipment to reduce aircraft noise, physical barriers and landscaping, and the purchase of land to meet environmental needs. Federal money could now pay for 90 percent of the costs of certain airport projects. This and later amendments also raised the amount of money available to airports.

By 1980, the aviation trust fund had received about $13.8 billion but only $4.1 billion had been spent on the airport system. Many parties were fighting over how the money from the fund should be spent, so most of the money remained unused. The U.S. Treasury on occasion has “tapped” the fund to use the money for projects unrelated to airports.
The Airport and Airway Improvement Act of 1982 raised taxes on aviation fuel and led to the Airport Improvement Program (AIP). It funds the construction of runways, taxiways and parts of terminal buildings and the purchase of land. It also funds automated weather observing systems, various safety-related equipment, and airport planning and noise studies. The AIP was amended several times, including in 1987, to favor small and disadvantaged businesses and individuals.

Around the world, today's airports may be operated by a national airport authority or transportation department, local authorities, airlines, private owners, or contractors. In the United States, most funds for new airports come from the sale of bonds managed by a local authority or sponsor. In the last decade, about $45 billion of AIP money has been spent, with about 80 percent going to airports with scheduled air service, though, as with earlier programs, there continue to be disagreements and even lawsuits over how to spend the aviation trust fund.


HISTORY OF OPERA MINI

Opera Software was founded as an independent company on August 30, 1995, by Jon Stephenson von Tetzchner and Geir Ivarsøy. The company was created to continue what was originally a research project at Telenor, the largest Norwegian telecommunications company.

Opera Software's first product, the Opera web browser version 2.1 for Windows, was released in 1997. Opera Software held an IPO in February 2004 and was listed on the Oslo Stock Exchange on March 11, 2004.

In an attempt to capitalize on the emerging market for Internet-connected handheld devices, a project to port the Opera browser to more platforms was started in 1998. Opera 4.0, released in 2000, included a new cross-platform core that facilitated creation of editions of Opera for multiple operating systems and platforms. Up to this point, the Opera browser was trialware and had to be purchased after the trial period ended. But version 5.0 (released in 2000) saw the end of the trial period requirement. Instead, Opera became ad-sponsored, displaying advertisements to users without a license. Users could still buy licenses for several years, however. Later versions of Opera gave the user the choice of seeing banner ads or targeted text advertisements from Google.

On January 12, 2005, Opera Software announced that it would offer free licenses to higher education institutions, a change from the previous cost of $1,000 USD for unlimited licenses. Schools that opted for the free license included the Massachusetts Institute of Technology (MIT), Harvard University, the University of Oxford, the Georgia Institute of Technology, and Duke University. Opera was commonly criticized for being ad-sponsored, since the advertisements were seen as a barrier to gaining market share; users could pay a license fee to remove the advertisement bar.
With version 8.5 (released in 2005) the advertisements were removed entirely and primary financial support came through revenue from Google (which is by contract Opera's default search engine).

The introduction in August 2005 of "Opera Mini", a Java ME-based web browser for mobile phones marketed not to end users but to mobile network operators, possibly marks a new direction toward a directly revenue-generating business, making the company less dependent on give-away and advertising-based Internet software. On September 20, 2005, Opera announced that it would remove the advertising from its browser and remain free of charge, in the hope that the move would prompt more users to switch to Opera. However, Opera continues to charge for its "Opera Mobile" product, which runs on many mobile devices.

Legal issues
On May 18, 2004, Opera Software settled a lawsuit. Their statement on the Oslo Stock Exchange read:
Opera Software ASA has settled legal claims with an international corporation resulting in payment to Opera of net USD 12.75 million. The other party is not a customer of Opera and the settlement does not negatively impact future revenues. The entire amount will be booked in Q2.
Details are confidential pursuant to the settlement agreement.
It is widely theorized that the "international corporation" named above is Microsoft, which had previously blocked Opera users from correctly viewing MSN.com (see the first and second MSN.com controversies in the history of the Opera web browser).
In 2007, Opera filed a complaint against Microsoft with the European Commission, alleging that bundling Internet Explorer with Microsoft Windows is harmful both to consumers and to other web browser companies.

HISTORY OF SATELLITE

Early conceptions

The first fictional depiction of a satellite being launched into orbit was a short story by Edward Everett Hale, The Brick Moon, serialized in The Atlantic Monthly starting in 1869. The idea surfaced again in Jules Verne's The Begum's Millions (1879).
In 1903, Konstantin Tsiolkovsky (1857–1935) published The Exploration of Cosmic Space by Means of Reaction Devices, the first academic treatise on the use of rocketry to launch spacecraft. He calculated that the orbital speed required for a minimal orbit around the Earth is 8 km/s and that a multi-stage rocket fueled by liquid propellants could achieve it. He proposed the use of liquid hydrogen and liquid oxygen, though other combinations can be used.
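Tsiolkovsky's figure can be checked against the standard circular-orbit formula v = sqrt(GM/r); the sketch below recomputes it with modern constants.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m

# Circular-orbit speed just above the surface: v = sqrt(G*M/r)
v = math.sqrt(G * M_EARTH / R_EARTH)
print(round(v / 1000, 1), "km/s")  # ~7.9 km/s, in line with Tsiolkovsky's 8 km/s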

In 1928 Slovenian Herman Potočnik (1892–1929) published his sole book, Das Problem der Befahrung des Weltraums - der Raketen-Motor (The Problem of Space Travel — The Rocket Motor), a plan for a breakthrough into space and a permanent human presence there. He conceived of a space station in detail and calculated its geostationary orbit.
He described the use of orbiting spacecraft for detailed peaceful and military observation of the ground and described how the special conditions of space could be useful for scientific experiments. The book described geostationary satellites (first put forward by Tsiolkovsky) and discussed communication between them and the ground using radio, but fell short of the idea of using satellites for mass broadcasting and as telecommunications relays.
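Potočnik's geostationary orbit likewise falls out of Kepler's third law; the following sketch recovers the familiar radius of about 42,164 km from the same constants as above.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
T = 86164            # one sidereal day, seconds

# Kepler's third law solved for the orbital radius: r = (G*M*T^2 / (4*pi^2))^(1/3)
r = (G * M_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(round(r / 1000), "km from Earth's center")    # ~42164 km
print(round((r - 6.371e6) / 1000), "km altitude")   # ~35793 km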
In a 1945 Wireless World article, the English science fiction writer Arthur C. Clarke (1917-2008) described in detail the possible use of communications satellites for mass communications. Clarke examined the logistics of satellite launch, possible orbits, and other aspects of the creation of a network of world-circling satellites, pointing to the benefits of high-speed global communications. He also suggested that three geostationary satellites would provide coverage over the entire planet.

History of artificial satellites
The first artificial satellite was Sputnik 1, launched by the Soviet Union on 4 October 1957, initiating the Soviet Sputnik program, with Sergei Korolev as chief designer and Kerim Kerimov as his assistant. The unanticipated announcement of Sputnik 1's success precipitated the Sputnik crisis in the United States and ignited the Space Race between the Soviet Union and the United States within the Cold War.
Sputnik 1 helped to identify the density of the upper atmospheric layers through measurement of its orbital changes and provided data on radio-signal distribution in the ionosphere. Because the satellite's body was filled with pressurized nitrogen, Sputnik 1 also provided the first opportunity for meteoroid detection: a loss of internal pressure from a meteoroid penetrating the outer surface would have been evident in the temperature data sent back to Earth.
Sputnik 2 was launched on November 3, 1957, and carried the first living passenger into orbit, a dog named Laika.
In May 1946, Project RAND had released the Preliminary Design of an Experimental World-Circling Spaceship, which stated, "A satellite vehicle with appropriate instrumentation can be expected to be one of the most potent scientific tools of the Twentieth Century." The United States had been considering launching orbital satellites since 1945 under the Bureau of Aeronautics of the United States Navy. The United States Air Force's Project RAND eventually released the above report but did not believe the satellite was a potential military weapon; rather, it considered the satellite a tool for science, politics, and propaganda. In 1954, the Secretary of Defense stated, "I know of no American satellite program."
On July 29, 1955, the White House announced that the U.S. intended to launch satellites by the spring of 1958. This became known as Project Vanguard. On July 31, the Soviets announced that they intended to launch a satellite by the fall of 1957.
Following pressure from the American Rocket Society, the National Science Foundation, and the International Geophysical Year, military interest picked up, and in early 1955 the Army and Navy were working on Project Orbiter, which involved using a Jupiter-C rocket to launch a satellite. The project succeeded, and Explorer 1 became the United States' first satellite on January 31, 1958.
In June 1961, three-and-a-half years after the launch of Sputnik 1, the Air Force used resources of the United States Space Surveillance Network to catalog 115 Earth-orbiting satellites.
The largest artificial satellite currently orbiting the Earth is the International Space Station.

Space Surveillance Network
The United States Space Surveillance Network (SSN) has been tracking space objects since 1957, when the Soviets opened the space age with the launch of Sputnik 1. Since then, the SSN has tracked more than 26,000 space objects orbiting Earth. The SSN currently tracks more than 8,000 man-made orbiting objects; the rest have re-entered Earth's turbulent atmosphere and disintegrated, or survived re-entry and impacted the Earth. The space objects now orbiting Earth range from satellites weighing several tons to pieces of spent rocket bodies weighing only 10 pounds. About seven percent of the space objects are operational satellites (roughly 560), and the rest are space debris. USSTRATCOM is primarily interested in the active satellites but also tracks space debris, which upon re-entry might otherwise be mistaken for incoming missiles. The SSN tracks space objects 10 centimeters in diameter (baseball size) or larger.

Non-Military Satellite Services
There are three basic categories of non-military satellite services:

1. Fixed Satellite Service
Fixed satellite services handle hundreds of billions of voice, data, and video transmission tasks between fixed points on the Earth's surface, across all countries and continents.

2. Mobile Satellite Systems
Mobile satellite systems help connect remote regions, vehicles, ships, people and aircraft to other parts of the world and/or other mobile or stationary communications units, in addition to serving as navigation systems.

3. Scientific Research Satellite (commercial and noncommercial)
Scientific research satellites provide us with meteorological information, land survey data (e.g., remote sensing), and other different scientific research applications such as earth science, marine science, and atmospheric research.

Attacks on satellites
In recent times, satellites have been hacked by militant organizations to broadcast propaganda and to pilfer classified information from military communication networks.
Satellites in low Earth orbit have been destroyed by ballistic missiles launched from Earth. Russia, the United States, and China have all demonstrated the ability to destroy satellites. In 2007, the Chinese military shot down an aging weather satellite, followed by the US Navy shooting down a defunct spy satellite in February 2008.

Jamming
Due to the low received signal strength of satellite transmissions they are prone to jamming by land-based transmitters. Such jamming is limited to the geographical area within the transmitter's range. GPS satellites are potential targets for jamming, but satellite phone and television signals have also been subjected to jamming.
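The vulnerability follows directly from the link geometry: free-space path loss grows with the square of distance, so a signal from geostationary orbit arrives tens of decibels weaker than one from a jammer a few kilometers away. The sketch below illustrates this with an assumed 12 GHz downlink; the frequency and distances are illustrative only, not tied to any particular system.

import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss in dB: 20*log10(4*pi*d*f / c)
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

satellite = fspl_db(36_000_000, 12e9)  # geostationary downlink, ~36,000 km
jammer = fspl_db(10_000, 12e9)         # land-based jammer 10 km away
print(round(satellite), "dB vs", round(jammer), "dB")  # ~205 dB vs ~134 dB
# The jammer's signal arrives roughly 70 dB (about ten million times) stronger.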
Satellite Services
* Satellite Internet access
* Satellite phone
* Satellite radio
* Satellite television
* Satellite navigation

Source: http://en.wikipedia.org/

HISTORY OF HARD DISK DRIVE

The commercial usage of hard disk drives began in 1956 with the shipment of an IBM 305 RAMAC system including IBM Model 350 disk storage. For many years, hard disk drives were large, cumbersome devices, more suited to use in the protected environment of a data center or large office than in a harsh industrial environment (due to their delicacy), or small office or home (due to their size and power consumption).

Before the early 1980s, most hard disk drives had 8-inch (195-210 mm) or 14-inch platters, required an equipment rack or a large amount of floor space (especially the large removable-media drives, frequently comparable in size to washing machines), and in many cases needed high-current and/or three-phase power hookups because of the large motors they used. Consequently, hard disk drives were not commonly used with microcomputers until after 1980, when Seagate Technology introduced the ST-506, the first 5.25-inch hard drive, with a formatted capacity of 5 megabytes.

The capacity of hard drives has grown exponentially over time. With early personal computers, a drive with a 20 megabyte capacity was considered large. During the mid to late 1990s, when PCs began storing not just text files and documents but pictures, music, and video, internal drives were made with 8 to 20 GB capacities. As of mid-2008, desktop hard disk drives typically have a capacity of 500 to 750 gigabytes, while the largest-capacity drives reach 2 terabytes.
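Taking the figures quoted here at face value (5 MB on the 1980 ST-506, 2 TB by 2009), the implied growth rate can be computed directly; this is a back-of-the-envelope sketch, not a rigorous trend line.

start_bytes, start_year = 5e6, 1980   # ST-506: 5 MB
end_bytes, end_year = 2e12, 2009      # first 2 TB drive

cagr = (end_bytes / start_bytes) ** (1 / (end_year - start_year)) - 1
print(f"~{cagr:.0%} per year")  # roughly 56% compound annual growth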
1950s - 1970s

The IBM 350 Disk File, invented by Reynolds Johnson, was introduced in 1956 with the IBM 305 RAMAC computer. This drive had fifty 24-inch platters, with a total capacity of five million characters. A single head assembly with two heads served all the platters, making the average access time very slow (just under one second).

The IBM 1301 Disk Storage Unit, announced in 1961, introduced the use of one head per data surface, with the heads riding on self-acting air bearings (flying heads).

The first disk drive to use removable media was the IBM 1311 drive, which used the IBM 1316 disk pack to store two million characters.

In 1973, IBM introduced the IBM 3340 "Winchester" disk drive, the first significant commercial use of low-mass, low-load heads with lubricated media. All modern disk drives use this technology or derivatives of it. Project head designer Kenneth Haughton named it after the Winchester .30-30 rifle: the developers had called it the "30-30" because it was planned to have two 30 MB spindles. The actual product, however, shipped with two spindles for data modules of either 35 MB or 70 MB.
1980s - PC era

Internal drives became the system of choice on PCs in the 1980s. Most microcomputer hard disk drives in the early 1980s were not sold under their manufacturer's names, but by OEMs as part of larger peripherals (such as the Corvus Disk System and the Apple ProFile). The IBM PC/XT had an internal hard disk drive, however, and this started a trend toward buying "bare" drives (often by mail order) and installing them directly into a system.

External hard drives remained popular for much longer on the Apple Macintosh and other platforms. Every Mac made between 1986 and 1998 has a SCSI port on the back, making external expansion easy; also, "toaster" Compact Macs did not have easily accessible hard drive bays (or, in the case of the Mac Plus, any hard drive bay at all), so on those models, external SCSI disks were the only reasonable option.
Timeline
1950s through 1990s

Five Decades of Disk Drive Industry Firsts, a list maintained by Disk/Trend, an HDD industry marketing consultancy.
1980s to present day

* 1980 - The world's first gigabyte-capacity disk drive, the IBM 3380, was the size of a refrigerator, weighed 550 pounds (about 250 kg), and had a price tag of $40,000.
* 1986 - Standardization of SCSI
* 1989 - Jimmy Zhu and H. Neal Bertram from UCSD proposed exchange decoupled granular microstructure for thin film disk storage media, still used today.
* 1991 - 2.5-inch 100 megabyte hard drive
* 1991 - PRML Technology (Digital Read Channel with 'Partial Response Maximum Likelihood' algorithm)
* 1992 - first 1.3-inch hard disk drive - HP Kittyhawk
* 1994 - IBM introduces Laser Textured Landing Zones (LZT)
* 1996 - IBM introduces GMR (Giant MR) Technology for read sensors
* 1998 - UltraDMA/33 and ATAPI standardized
* 1999 - IBM releases the Microdrive in 170 MB and 340 MB capacities
* 2002 - 137 GB addressing space barrier broken (see the sketch after this timeline)
* 2003 - Serial ATA introduced
* 2005 - First 500 GB hard drive shipping (Hitachi GST)
* 2005 - Serial ATA 3G standardized
* 2005 - Seagate introduces Tunnel MagnetoResistive Read Sensor (TMR) and Thermal Spacing Control
* 2005 - Introduction of faster SAS (Serial Attached SCSI)
* 2005 - First perpendicular recording HDD shipped: Toshiba 1.8-inch 40/80 GB
* 2006 - First 750 GB hard drive (Seagate)
* 2006 - First 200 GB 2.5" hard drive utilizing Perpendicular recording (Toshiba).
* 2006 - Fujitsu develops heat-assisted magnetic recording (HAMR), which could one day achieve densities of one terabit per square inch
* 2007 - First 1 terabyte hard drive (Hitachi GST)
* 2008 - First 1.5 terabyte hard drive (Seagate)
* 2009 - First 2.0 terabyte hard drive (Western Digital)
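The 137 GB figure in the 2002 entry above comes from the 28-bit Logical Block Addressing limit of the older ATA interface; with 512-byte sectors the ceiling works out as follows (48-bit LBA later removed the limit).

SECTOR_SIZE = 512      # bytes per sector
LBA_BITS = 28          # address width of the older ATA interface

max_bytes = (2 ** LBA_BITS) * SECTOR_SIZE
print(max_bytes)              # 137438953472 bytes
print(max_bytes / 10**9)      # ~137.4 GB (decimal gigabytes)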

HISTORY OF AMD

1969
AMD incorporates with $100,000; establishes headquarters in Sunnyvale, California

1970
AMD introduces its first proprietary device: the Am2501 logic counter


1972
AMD goes public

1979
Production begins in new AMD Austin manufacturing facility

1979
AMD debuts on the New York Stock Exchange

1982
At IBM's request, AMD signs an agreement to serve as a second source to Intel for IBM PC microprocessors

1984
AMD is listed in "The 100 Best Companies to Work for in America

1985
AMD is listed in the Fortune 500 for the first time
ATI incorporates
ATI develops its first graphics controller and first graphics board product

1986
ATI secures major contract with Commodore Business Machines to supply 7000 chips per week

1987
AMD acquires Monolithic Memories, Inc. and enters programmable logic business
ATI debuts EGA Wonder™ and VGA Wonder

1988
Work begins on AMD Submicron Development Center

1989
ATI assists in establishment of VESA standard for graphics industry

1991
AMD's Am386® microprocessor family debuts
ATI introduces Mach8™ chip and board products: first ATI products to process graphics independently of the CPU

1992
ATI introduces Mach32™: first ATI integrated graphics controller and accelerator in one chip
ATI releases VESA Local Bus (VLB) products, followed by Peripheral Component Interconnect (PCI) products
ATI establishes ATI GmbH in Munich, Germany

1993
AMD Am486® microprocessor family debuts
AMD establishes joint venture with Fujitsu to produce Flash memory products
ATI goes public; stocks are listed on NASDAQ and Toronto Stock Exchange

1994
AMD and Compaq Computer Corp. form long-term alliance to power Compaq computers with Am486 microprocessors
ATI introduces Mach64™: first ATI graphics boards to accelerate motion video

1995
AMD introduces AMD-K5® microprocessor: first independently designed, socket-compatible x86 microprocessor
ATI is first graphics company to ship Mac-compatible graphics boards

1996
AMD acquires NexGen, a microprocessor company
ATI releases industry's first 3D graphics chip, first combination graphics and TV tuner card, and first chip to display computer graphics on a television
ATI enters the notebook market with the industry's first notebook 3D graphics accelerator
ATI establishes ATI Ireland

1997
AMD introduces the AMD-K6® microprocessor: helps drive PC prices below $1,000 for the first time, making PCs affordable to average consumers
ATI is first graphics company to provide hardware support for DVD acceleration and display
ATI is first graphics company to release products supporting Accelerated Graphics Port, the new industry standard

1998
ATI is first company to introduce a complete set-top box design
ATI ships its ten millionth AGP chip

1999
AMD Athlon™ processor becomes first seventh-generation processor for Microsoft® Windows® computing
Vantis, AMD's programmable logic business, sold to Lattice Semiconductor

2000
AMD is first to break the historic 1GHz (one billion clock cycles per second) barrier with the AMD Athlon™ processor
AMD introduces AMD PowerNow!™ technology with Mobile AMD-K6®-2+ processors
ATI Radeon™ graphics technology debuts: leading product for high-end gaming and 3D workstations
ATI acquires ArtX, Inc., a graphics chipset company

2001
AMD Athlon™ MP processor debuts: the company's first multiprocessing platform
AMD HyperTransport™ technology is adopted by Agilent, Apple Computer, Broadcom, Cisco Systems, IBM, nVidia, Sun, and Texas Instruments

2002
AMD acquires Alchemy Semiconductor for low-power, embedded processor technology
AMD Cool'n'Quiet™ technology debuts with Athlon™ XP family: helps lower power consumption, enables quieter-running system, and delivers performance on-demand to maximize users' computing experience
ATI launches ATI Radeon™ 9700 Pro: world's first DirectX 9 graphics processor

2003
AMD and IBM sign joint manufacturing technology development agreement to develop future generation manufacturing technologies
AMD Opteron™ processor and AMD Athlon™ 64 processor debut
With Fujitsu, AMD forms FASL, LLC, and a new company: Spansion™
AMD forms strategic alliance with Sun Microsystems and acquires National Semiconductor's x86 business
ATI introduces ATI Radeon™ 9600 XT: world's first high-volume 0.13 µm low-k chips

2004
AMD demonstrates world's first x86 dual-core processor
AMD announces the 50x15 Initiative with the goal of accelerating affordable Internet access and basic computing to 50 percent of the world's population by 2015
Advanced Micro Devices (China) Co., Ltd. is established, headquartered in Beijing
ATI is listed in the NASDAQ 100
ATI introduces first 110nm GPUs (ATI Radeon™ X800 XL)

2005
AMD introduces AMD Turion™ 64 mobile technology for notebook PCs and the AMD Athlon™ 64 X2 dual-core processor for desktops
AMD introduces the world's highest performing processors for 1-8P x86 servers and workstations
AMD files landmark antitrust litigation against Intel for illegally abusing its monopoly to exclude and limit competition
Spansion™ goes public
AMD announces grand opening of Fab 36 in Dresden, Germany
ATI GPU is featured in Microsoft Xbox 360, revolutionizing high-definition gaming

2006
AMD acquires ATI to create a new, innovative processing powerhouse
CrossFire™ multi-GPU gaming platform debuts
AMD LIVE!™ media center PCs debut
Dell Inc. announces it will offer AMD processor-based systems
AMD begins revenue shipments of processors from Fab 36
AMD's Shanghai Research and Development Center (SRDC) launches to focus on the development of AMD's next-generation mobile platforms
AMD demonstrates the industry's first native quad-core x86 server processor
AMD is a founding member of The Green Grid, an open, global organization designed to decrease IT facility energy usage patterns

2007
AMD demonstrates Accelerated Computing platform that breaks teraflop performance barrier
AMD introduces ATI Radeon™ HD 2000 series graphics processors to deliver The Ultimate Visual Experience™ graphics for desktop and mobile platforms