Tuesday, July 29, 2014

History of Programming Languages


The first programming languages predate the modern computer.
During a nine-month period in 1842-1843, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea about Charles Babbage's newest proposed machine, the Analytical Engine. With the article she appended a set of notes which specified in complete detail a method for calculating Bernoulli numbers with the Analytical Engine, recognized by some historians as the world's first computer program.
Herman Hollerith realized that he could encode information on punch cards when he observed that train conductors encode the appearance of the ticket holders on the train tickets using the position of punched holes on the tickets. Hollerith then encoded the 1890 census data on punch cards.
The first computer codes were specialized for their applications. In the first decades of the 20th century, numerical calculations were based on decimal numbers. Eventually it was realized that logic could be represented with numbers, not only with words. For example, Alonzo Church was able to express the lambda calculus in a formulaic way. The Turing machine was an abstraction of the operation of a tape-marking machine, for example, one in use at the telephone companies. Turing machines set the basis for the storage of programs as data in the von Neumann architecture of computers by representing a machine through a finite number, so that a machine's description could itself be treated as data. However, unlike the lambda calculus, Turing's code does not serve well as a basis for higher-level languages; its principal use is in rigorous analyses of algorithmic complexity.
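The sense in which the lambda calculus expresses logic formulaically can be illustrated with Church booleans, where truth values and logical operations are all just functions. The following is a minimal sketch in modern Python, my own illustration rather than anything from the original account:

```python
# A minimal sketch of Church booleans (an illustration only, not from the source text).
# In the lambda calculus, a truth value is a function that selects one of two
# arguments, so logical operations become pure formulas built from functions.

TRUE = lambda x: lambda y: x      # picks its first argument
FALSE = lambda x: lambda y: y     # picks its second argument

AND = lambda p: lambda q: p(q)(FALSE)   # if p is TRUE the result is q, otherwise FALSE
NOT = lambda p: p(FALSE)(TRUE)          # swaps the two outcomes

def to_bool(church):
    """Convert a Church boolean back to an ordinary Python bool for display."""
    return church(True)(False)

print(to_bool(AND(TRUE)(FALSE)))  # prints False
print(to_bool(NOT(FALSE)))        # prints True
```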
Like many "firsts" in history, the first modern programming language is hard to identify. From the start, the restrictions of the hardware defined the language. Punch cards allowed 80 columns, but some of the columns had to be used for a sorting number on each card. FORTRAN included some keywords which were the same as English words, such as "IF", "GOTO" (go to) and "CONTINUE". The use of a magnetic drum for memory meant that computer programs also had to be interleaved with the rotations of the drum. Thus the programs were more hardware-dependent.
To some people, what was the first modern programming language depends on how much power and human-readability is required before the status of "programming language" is granted. Jacquard looms and Charles Babbage's Difference Engine both had simple, extremely limited languages for describing the actions that these machines should perform. One can even regard the punch holes on a player piano scroll as a limited domain-specific language, albeit not designed for human consumption.

First Programming Languages

In the 1940s, the first recognizably modern, electrically powered computers were created. Their limited speed and memory capacity forced programmers to write hand-tuned assembly language programs. It was eventually realized that programming in assembly language required a great deal of intellectual effort and was error-prone.

The first programming languages designed to communicate instructions to a computer were written in the 1950s. An early high-level programming language to be designed for a computer was Plankalkül, developed for the German Z3 by Konrad Zuse between 1943 and 1945. However, it was not implemented until much later, in 1998 and 2000.
John Mauchly's Short Code, proposed in 1949, was one of the first high-level languages ever developed for an electronic computer. Unlike machine code, Short Code statements represented mathematical expressions in understandable form. However, the program had to be translated into machine code every time it ran, making the process much slower than running the equivalent machine code.

The Manchester Mark 1 ran programs written in Autocode from 1952.
At the University of Manchester, Alick Glennie developed Autocode in the early 1950s. A programming language, it used a compiler to automatically convert the language into machine code. The first code and compiler were developed in 1952 for the Mark 1 computer at the University of Manchester, and Autocode is considered to be the first compiled high-level programming language.
The second autocode was developed for the Mark 1 by R. A. Brooker in 1954 and was called the "Mark 1 Autocode". Brooker also developed an autocode for the Ferranti Mercury in the 1950s in conjunction with the University of Manchester. The version for the EDSAC 2 was devised by D. F. Hartley of the University of Cambridge Mathematical Laboratory in 1961. Known as EDSAC 2 Autocode, it was a straight development from Mercury Autocode adapted for local circumstances, and was noted for its object code optimisation and source-language diagnostics, which were advanced for the time. A contemporary but separate thread of development, Atlas Autocode, was developed for the University of Manchester Atlas 1 machine.
Another early programming language was devised by Grace Hopper in the US, called FLOW-MATIC. It was developed for the UNIVAC I at Remington Rand during the period from 1955 until 1959. Hopper found that business data processing customers were uncomfortable with mathematical notation, and in early 1955, she and her team wrote a specification for an English-like programming language and implemented a prototype. The FLOW-MATIC compiler became publicly available in early 1958 and was substantially complete in 1959. FLOW-MATIC was a major influence on the design of COBOL, since only it and its direct descendant AIMACO were in actual use at the time. The language Fortran was developed at IBM in the mid-1950s and became the first widely used high-level general-purpose programming language.
Other languages still in use today include LISP (1958), invented by John McCarthy, and COBOL (1959), created by the Short Range Committee and heavily influenced by Grace Hopper. Another milestone in the late 1950s was the publication, by a committee of American and European computer scientists, of "a new language for algorithms": the ALGOL 60 Report (the "ALGOrithmic Language"). This report consolidated many ideas circulating at the time and featured three key language innovations:
  • nested block structure: code sequences and associated declarations could be grouped into blocks without having to be turned into separate, explicitly named procedures;
  • lexical scoping: a block could have its own private variables, procedures and functions, invisible to code outside that block; that is, information hiding (see the short sketch after this list).
Another innovation, related to this, was in how the language was described:
  • a mathematically exact notation, Backus-Naur Form (BNF), was used to describe the language's syntax. Nearly all subsequent programming languages have used a variant of BNF to describe the context-free portion of their syntax.
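As a rough modern illustration of the first two innovations (my own sketch in Python, not an example from the ALGOL 60 Report), nested block structure and lexical scoping mean that an inner block can use the names declared in its enclosing block, while code outside that block cannot see them at all:

```python
# A minimal sketch of nested blocks and lexical scoping,
# using Python as a stand-in for ALGOL 60 (an illustrative assumption only).

def outer():
    secret = 42                # declared in the enclosing block

    def inner():
        # lexical scoping: the nested block can read names from the block enclosing it
        return secret + 1

    return inner()

print(outer())                 # prints 43
# print(secret)                # would raise NameError: 'secret' is invisible outside the block
```

BNF, in turn, describes such syntax with grammar rules (productions) of the general form <name> ::= alternatives, and nearly all later language definitions adopted some variant of this notation.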
Algol 60 was particularly influential in the design of later languages, some of which soon became more popular. The Burroughs large systems were designed to be programmed in an extended subset of Algol.
Algol's key ideas were continued, producing ALGOL 68:
  • syntax and semantics became even more orthogonal, with anonymous routines, a recursive typing system with higher-order functions, etc.;
  • not only the context-free part, but the full language syntax and semantics were defined formally, in terms of Van Wijngaarden grammar, a formalism designed specifically for this purpose.
Algol 68's many little-used language features (for example, concurrent and parallel blocks) and its complex system of syntactic shortcuts and automatic type coercions made it unpopular with implementers and gained it a reputation for being difficult. Niklaus Wirth actually walked out of the design committee to create the simpler Pascal language.
Some important languages that were developed in this period include:
* Regional assembly language
* Autocode
* IPL
* Flow-Matic
* Fortran
* Comtran
* LISP
* Algol 58
* Fact
* COBOL
* RPG
* APL
* SIMULA
* SNOBOL
* CPL
* BASIC
* PL/I
* JOSS
* BCPL

Reference site: http://en.wikipedia.org/wiki/History_of_programming_languages




Monday, July 28, 2014

History of the Internet


The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, Great Britain, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol). The first message was sent over the ARPANET from computer science Professor Leonard Kleinrock's laboratory at the University of California, Los Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).
Packet switching networks such as the ARPANET, the Mark I at NPL in the UK, CYCLADES, the Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.
Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites for research and education organizations in the United States. Commercial Internet service providers (ISPs) began to emerge in the late 1980s. The ARPANET was decommissioned in 1990. Private connections to the Internet by commercial entities became widespread quickly, and the NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic.
Since the mid-1990s, the Internet has had a revolutionary impact on culture and commerce, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as NSF's very high speed Backbone Network Service (vBNS), Internet2, and National LambdaRail. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet's takeover of the global communication landscape was almost instant in historical terms: it carried only 1% of the information flowing through two-way telecommunications networks in 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007.[1] Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking.
Reference site: http://en.wikipedia.org/wiki/History_of_the_Internet

History of Computer Systems




The history of the computer system dates back to the 17th century, though the term "computer" was not used then. Even earlier, the first mathematical or mechanical device used for the computation of data, the abacus, was found among the Chinese before the birth of Christ. It consisted of beads strung on wires used for arithmetic calculation, and the abacus is still used in certain circles in China even to this day.
However, the first device capable of performing arithmetic computation was designed around 1840 by Charles Babbage. Though he called it the "analytical engine", the concepts underlying its design foreshadowed the emergence of what is called the computer today.
Its components were a storage system for data, an arithmetic unit, a control unit, an input device and an output device. These same elements define the computer today. Babbage envisioned that this machine would solve a variety of problems, as enumerated by what it could perform.
A succinct history of computer systems can be categorised into generations as follows:
First Generation Computers
These computers made their first appearance between 1940 and 1958. They were very large, perhaps the size of a room, and very expensive to use and maintain. The main memory of these computers was built from vacuum tubes. Punch cards and magnetic tape were used for input and for the storage of data, and output was via printouts. They were general-purpose computers, but they could perform only one function at a time. Examples of first-generation computers are:
Electronic Numerical Integrator and Computer (ENIAC)
Electronic Delay Storage Automatic Computer (EDSAC)
Universal Automatic Computer (UNIVAC)
Second Generation Computers
The second generation in the history of computer systems is traceable to between 1959 and 1966. The memory of these computers used transistors, and magnetic tape was used to store data. Transistors provided faster operation and generated less heat. Early versions of high-level programming languages such as COBOL and FORTRAN were developed at this time.
An example of a computer in this generation was the IBM 1401, developed in 1959. It promoted commercialised computer data processing to a higher degree for the first time, and this of course made the IBM 1401 a very successful business computer.
Third Generation Computers
The history of the third generation of computers dates to between 1964 and 1970. The memories of these computers were made from silicon chips formed into tiny, miniaturised integrated circuits.
This provided vast internal storage and operation in billionths of a second. Magnetic disks were introduced for secondary storage, solving the problems of slowness and sequential access to data associated with magnetic tape. The integrated circuits (ICs) made it possible for the computer to run more than one program at the same time. The IBM 360 series is an example of a very successful third-generation computer.
Fourth Generation Computers
Fourth-generation computers made their appearance between 1971 and 1990. These were the first computers to use large-scale integrated circuits (LSIC). The memory of these computers and the logic circuits that perform logical operations were built from these large-scale integrated circuits.
This was the era that gave birth to the invention of the microprocessor, which provided enormous processing speed. An example of this type of processor was the Intel 4004, which could execute tens of thousands of instructions per second. The Intel 4004 was manufactured by the Intel Corporation in the USA and carried 2,250 transistors on a tiny silicon chip. When the microprocessor was integrated with the input and output systems of a computer, the era of the microcomputer emerged.
Fifth Generation Computers
The fifth generation in the history of computer systems is an era of improvement on the microcomputer, which was itself seen as one of the greatest technological breakthroughs of the 20th century. This generation runs from 1991 into the future and is popularly associated with advances in artificial intelligence, the aim of which is to make computers imitate human intelligence. This includes speech-activated computers that have the ability to respond to natural language.
The world of computing is in a state of flux, and this will continue into the future. Computer users and sellers must necessarily seek the latest updates to remain relevant and make the utmost of these emerging breakthroughs in technology, most especially in "artificial intelligence", as it is popularly called.
Do you have any recent findings with respect to the history of computer systems? Feel free to contribute on how computing has impacted people's ways of life.
Reference site: http://infotechnology.hubpages.com/hub/History-of-Computer-System

About me

I'm Holden Atadero, 18 years of age. I was born in Koronadal and am currently living in Baldoza, La Paz with my grandparents. I graduated from Barotac Viejo National High School in 2013. My hobbies are playing computer games, drawing, watching anime and reading manga. I have also started collecting some stuff like anime merchandise. I was also able to move to Australia together with my mother for at least 3 months, and I returned to the Philippines by myself because I couldn't stand the loneliness there. I also played basketball there as a pastime together with other Filipinos, who were actually the sons of my aunt's friend. My favorite foods
are boiled egg, burger, squash, potatoes and spaghetti. I also love downloading PC and PSP games if the internet is fast. I also love cooking when I have spare time, though most of the time I eat at a fast-food chain. I'm not a very athletic person, but I can do better work than others when it comes to indoor jobs. Paintings of nature make me feel at ease too. As for my parents, my father died when I was 12 years old, while my mother is currently living in Australia.