Saturday, July 18, 2009

Software engineering

Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software.[1]

The term software engineering first appeared at the 1968 NATO Software Engineering Conference and was meant to provoke thought regarding the "software crisis" of the time.[2][3] Since then, it has continued as a profession and field of study dedicated to creating software that is of higher quality, more affordable, maintainable, and quicker to build. Since the field is still relatively young compared to its sister fields of engineering, there is still much debate about what software engineering actually is and whether it conforms to the classical definition of engineering. It has grown organically out of the limitations of viewing software as just programming. Software development is a term sometimes preferred by practitioners in the industry who view software engineering as too heavy-handed and constrictive for the malleable process of creating software. Although software engineering is a young profession, its outlook is bright: Money Magazine and Salary.com rated software engineering as the best job in America in 2006.

Thursday, July 16, 2009

Privacy Policy

Privacy Policy for http://thecomputer-science.blogspot.com/

If you require any more information or have any questions about our privacy policy, please feel free to contact us by email at radenayu85@gmail.com.

At http://thecomputer-science.blogspot.com/, the privacy of our visitors is of extreme importance to us. This privacy policy document outlines the types of personal information that are received and collected by http://thecomputer-science.blogspot.com/ and how they are used.

Log Files
Like many other Web sites, http://thecomputer-science.blogspot.com/ makes use of log files. The information inside the log files includes internet protocol (IP) addresses, type of browser, Internet Service Provider (ISP), date/time stamp, referring/exit pages, and number of clicks. This information is used to analyze trends, administer the site, track users' movement around the site, and gather demographic information. IP addresses and other such information are not linked to any information that is personally identifiable.

Cookies and Web Beacons
http://thecomputer-science.blogspot.com/ uses cookies to store information about visitors' preferences, record user-specific information about which pages the user accesses or visits, and customize Web page content based on the visitor's browser type or other information that the visitor sends via their browser.

DoubleClick DART Cookie
.:: Google, as a third party vendor, uses cookies to serve ads on http://thecomputer-science.blogspot.com/.
.:: Google's use of the DART cookie enables it to serve ads to users based on their visit to http://thecomputer-science.blogspot.com/ and other sites on the Internet.
.:: Users may opt out of the use of the DART cookie by visiting the Google ad and content network privacy policy at the following URL - http://www.google.com/privacy_ads.html

Some of our advertising partners may use cookies and web beacons on our site. Our advertising partners include ....
Google Adsense


These third-party ad servers or ad networks use technology in the advertisements and links that appear on http://thecomputer-science.blogspot.com/, which are sent directly to your browser. They automatically receive your IP address when this occurs. Other technologies (such as cookies, JavaScript, or Web Beacons) may also be used by the third-party ad networks to measure the effectiveness of their advertisements and/or to personalize the advertising content that you see.

http://thecomputer-science.blogspot.com/ has no access to or control over these cookies that are used by third-party advertisers.

You should consult the respective privacy policies of these third-party ad servers for more detailed information on their practices as well as for instructions about how to opt-out of certain practices. http://thecomputer-science.blogspot.com/'s privacy policy does not apply to, and we cannot control the activities of, such other advertisers or web sites.

If you wish to disable cookies, you may do so through your individual browser options. More detailed information about cookie management with specific web browsers can be found at the browsers' respective websites.

Monday, July 13, 2009

Theoretical computer science

The broader field of theoretical computer science encompasses both the classical theory of computation and a wide range of other topics that focus on the more abstract, logical, and mathematical aspects of computing.

Subfields include mathematical logic, automata theory, number theory, graph theory, type theory, category theory, computational geometry, and quantum computing theory, with characteristic expressions such as P → Q in mathematical logic and Γ ⊢ x : Int in type theory.

Friday, July 10, 2009

Theory of computation

The study of the theory of computation is focused on answering fundamental questions about what can be computed and what amount of resources is required to perform those computations. In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a computational problem.

The famous "P=NP?" problem, one of the Millennium Prize Problems,[19] is an open problem in the theory of computation.
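
A small sketch can make the distinction concrete. Using Boolean satisfiability (a standard NP-complete problem) as an example, verifying a proposed solution is fast, while the obvious way of finding one tries exponentially many candidates; whether finding can always be made as fast as verifying is essentially what "P=NP?" asks. The formula and helper names below are illustrative, not drawn from any particular source.

```python
from itertools import product

# A CNF formula as a list of clauses; each literal is (variable, is_negated).
# Hypothetical example: (x0 OR NOT x1) AND (x1 OR x2)
formula = [[(0, False), (1, True)], [(1, False), (2, False)]]

def satisfies(assignment, formula):
    """Verifying a proposed assignment takes time linear in the formula size."""
    return all(any(assignment[var] != neg for var, neg in clause)
               for clause in formula)

def brute_force_sat(formula, num_vars):
    """Finding an assignment by exhaustive search takes up to 2**num_vars
    tries -- exponential time, unless a fundamentally faster method exists."""
    for bits in product([False, True], repeat=num_vars):
        if satisfies(bits, formula):
            return bits
    return None

print(brute_force_sat(formula, 3))  # (False, False, True)
```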



Thursday, July 9, 2009

Fields of computer science

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software. The Computer Sciences Accreditation Board (CSAB) – which is made up of representatives of the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers Computer Society, and the Association for Information Systems – identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, computer-human interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.

Monday, July 6, 2009

History of computer science

The history of computer science began long before the modern discipline of computer science that emerged in the twentieth century. The progression from mechanical inventions and mathematical theories towards modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.

Early history


Algorithms

In the 7th century, Indian mathematician Brahmagupta gave the first explanation of the Hindu-Arabic numeral system and the use of zero as both a placeholder and a decimal digit.

Around the year 825, Persian mathematician Al-Khwarizmi wrote a book, On the Calculation with Hindu Numerals, that was principally responsible for the diffusion of the Indian system of numeration in the Middle East and then Europe. Around the 12th century, this book was translated into Latin as Algoritmi de numero Indorum. These books presented new concepts for performing a series of steps to accomplish a task, such as the systematic application of arithmetic to algebra. The term algorithm is derived from his name.

Binary logic


Around the 3rd century BC, Indian mathematician Pingala discovered the binary numeral system. In this system, still used in all modern computers, any number can be represented by a sequence of ones and zeros.
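
As a rough illustration of that claim, the short sketch below (an illustrative example of my own, not part of the historical record) converts a number to and from its binary form by repeated division by two.

```python
def to_binary(n):
    """Repeatedly divide by 2; the remainders, read in reverse, are the bits."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))   # "1101"  (8 + 4 + 0 + 1)
print(int("1101", 2))  # 13, converting back
```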

In 1703, Gottfried Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854 with a complete system that allowed computational processes to be mathematically modeled.

By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. In 1801, punch cards controlled Joseph Marie Jacquard's loom: a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.

Birth of computer science


Before the 1920s, computers (sometimes computors) were human clerks who performed computations, usually under the lead of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and they were known to have degrees in calculus. Some performed astronomical calculations for calendars.

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially those that worked in accordance with the effective methods of the Church-Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or difference in electrical potential.

Digital machinery, in contrast to analog, represented a numeric value as discrete states and stored each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

The phrase computing machine gradually gave way, after the late 1940s, to just computer as electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties the way analog devices were, a logical computer, based on digital equipment, was able to do anything that could be described as "purely mechanical." Alan Turing, known as the Father of Computer Science, invented such a logical computer, known as the Turing machine, which later evolved into the modern computer. These new computers were also able to perform non-numeric computations, such as music.

From the time when computational processes were performed by human clerks, the study of computability began to develop into a science by making evident what was not explicit in the ordinary, more immediate sense.

Emergence of a discipline


The theoretical groundwork


The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.

These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

Turing also included with the thesis a description of the Turing machine. A Turing machine has an infinitely long tape and a read/write head that can move along the tape, changing the values along the way. Clearly such a machine could never be built, but nonetheless, the model can simulate the computation of any algorithm which can be performed on a modern computer.
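
A minimal sketch of such a machine, assuming a simple illustrative transition table of my own (the model, not the particular table, is the point), might look like this:

```python
# Minimal Turing machine sketch: a tape, a head, and a transition table.
# The hypothetical machine below flips 0s to 1s until it reads a blank,
# then halts.
from collections import defaultdict

def run(transitions, tape, state="start", blank="_", max_steps=1000):
    cells = defaultdict(lambda: blank, enumerate(tape))  # "infinite" tape
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells[head]
        write, move, state = transitions[(state, symbol)]
        cells[head] = write                      # change the value under the head
        head += 1 if move == "R" else -1         # move the head along the tape
    return "".join(cells[i] for i in sorted(cells))

# (state, symbol read) -> (symbol to write, head move, next state)
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip, "0110"))  # "1111_"
```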

Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s, but committed suicide in 1954.

At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus".

In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby.

In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.

Shannon and information theory


Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
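
The idea can be sketched in a few lines: relays wired in series behave like Boolean AND, relays wired in parallel behave like OR, so a switching circuit can be analyzed as a Boolean expression. The particular circuit below is a hypothetical example, not one taken from Shannon's thesis.

```python
# Series relays conduct only when both are closed (AND); parallel relays
# conduct when either is closed (OR). A circuit is then a Boolean formula.
def series(a, b):
    return a and b

def parallel(a, b):
    return a or b

# Hypothetical circuit: switch x in series with (y in parallel with NOT z).
def circuit(x, y, z):
    return series(x, parallel(y, not z))

# Exhaustive truth table -- the kind of analysis Boolean algebra makes routine.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            print(x, y, z, "->", circuit(x, y, z))
```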

Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
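
The central quantity of that paper is the entropy of a source, which bounds how compactly its symbols can be encoded on average. A minimal sketch, using a hypothetical symbol distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source: one frequent symbol and two rarer ones.
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits per symbol on average
```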

Wiener and Cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published "Cybernetics" in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, memory devices, and other cognitive similarities with his analysis of brain waves.

The first computer bug

The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II.[1] The invention of the term 'bug' is often but erroneously attributed to Grace Hopper, a rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident' along with the insect and the notation "First actual case of bug being found" (see software bug for details).