Programming, Algorithm, and Flowchart

Programming
Computer programming (often shortened to programming or coding) is the process of writing, testing, debugging/troubleshooting, and maintaining the source code of computer programs. This source code is written in a programming language. The code may be a modification of an existing source or something completely new. The purpose of programming is to create a program that exhibits a certain desired behavior (customization). The process of writing source code requires expertise in many different subjects, including knowledge of the application domain, specialized algorithms, and formal logic.
Computer programmers are those who write computer software. Their job usually involves:
  • Requirements analysis
  • Specification
  • Software architecture
  • Coding
  • Compilation
  • Software testing
  • Documentation
  • Integration
  • Maintenance

Programming languages

Different programming languages support different styles of programming (called programming paradigms). The choice of language used is subject to many considerations, such as company policy, suitability to task, availability of third-party packages, or individual preference. Ideally, the programming language best suited for the task at hand will be selected. Trade-offs from this ideal involve finding enough programmers who know the language to build a team, the availability of compilers for that language, and the efficiency with which programs written in a given language execute.

List of programming languages

1. C++
2. Basic
3. Java
4. Pascal
5. Fortran
6. COBOL
7. etc.

Quality requirements

  • Efficiency: refers to the program's consumption of system resources (processor time, memory, slow devices, networks, and to some extent even user interaction), which should be as low as possible.
  • Reliability: the results of the program must be correct, which not only implies a correct code implementation but also reduction of error propagation (e.g. resulting from data conversion) and prevention of typical errors such as overflow, underflow, or division by zero.
  • Robustness: a program must anticipate situations such as data-type conflicts and other incompatibilities that would otherwise cause run-time errors and stop the program. The focus of this aspect is the interaction with the user and the handling of error messages; a minimal sketch of this kind of defensive input handling appears after this list.
  • Portability: the program should work as it is in any software and hardware environment, or at least without significant reprogramming.
  • Readability: the purpose of the main program and of each subroutine must be clearly defined, with appropriate comments and a self-explanatory choice of symbolic names (constants, variables, function names, classes, methods, ...).
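
As a concrete illustration of the Robustness and Readability points above, here is a minimal C++ sketch (the program and its messages are our own example, not taken from the text): it validates user input before dividing, so a typical run-time error such as division by zero is anticipated rather than allowed to stop the program, and the names and comments are chosen to be self-explanatory.

#include <iostream>

int main() {
    double dividend = 0.0;
    double divisor = 0.0;

    std::cout << "Enter a dividend and a divisor: ";

    // Robustness: make sure the input really was two numbers.
    if (!(std::cin >> dividend >> divisor)) {
        std::cerr << "Invalid input: two numbers were expected." << std::endl;
        return 1;
    }

    // Robustness: anticipate division by zero instead of letting it stop the program.
    if (divisor == 0.0) {
        std::cerr << "Division by zero is not allowed." << std::endl;
        return 1;
    }

    // Readability: self-explanatory names and comments make the intent clear.
    double quotient = dividend / divisor;
    std::cout << "Quotient: " << quotient << std::endl;
    return 0;
}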

Algorithmic complexity

The academic field and the engineering practice of computer programming are both largely concerned with discovering and implementing the most efficient algorithms for a given class of problem. For this purpose, algorithms are classified into orders using so-called Big O notation, O(n), which expresses resource use, such as execution time or memory consumption, in terms of the size of an input. Expert programmers are familiar with a variety of well-established algorithms and their respective complexities and use this knowledge to choose algorithms that are best suited to the circumstances.
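
For illustration (the code below is our own sketch, not part of the original text), the two functions contrast a linear scan, whose running time grows as O(n) with the size of the input, with binary search over a sorted array, which needs only O(log n) comparisons; choosing between them is exactly the kind of complexity-based decision described above.

#include <cstddef>
#include <vector>

// Linear search: may examine every one of the n elements, so it runs in O(n) time.
int linearSearch(const std::vector<int>& data, int target) {
    for (std::size_t i = 0; i < data.size(); ++i) {
        if (data[i] == target) {
            return static_cast<int>(i);
        }
    }
    return -1;  // not found
}

// Binary search: halves the remaining range at each step, so it runs in O(log n) time,
// but it requires the data to be sorted in advance.
int binarySearch(const std::vector<int>& data, int target) {
    int low = 0;
    int high = static_cast<int>(data.size()) - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (data[mid] == target) {
            return mid;
        }
        if (data[mid] < target) {
            low = mid + 1;
        } else {
            high = mid - 1;
        }
    }
    return -1;  // not found
}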

Methodologies

The first step in most formal software development projects is requirements analysis, followed by modeling, implementation, and failure elimination (debugging). Many differing approaches exist for each of those tasks. One popular approach for requirements analysis is Use Case analysis.
Popular modeling techniques include Object-Oriented Analysis and Design (OOAD) and Model-Driven Architecture (MDA). The Unified Modeling Language (UML) is a notation used for both OOAD and MDA.
A similar technique used for database design is Entity-Relationship Modeling (ER Modeling).
Implementation techniques include imperative languages (object-oriented or procedural), functional languages, and logic languages.
Debugging is most often done with IDEs like Visual Studio, NetBeans, and Eclipse. Separate debuggers like gdb are also used.
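
As a small illustration of the paradigm difference mentioned above (the example is ours, not from the text), the same sum can be written imperatively as an explicit loop, or in a more functional style as a single expression using std::accumulate from the standard <numeric> header:

#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> values = {1, 2, 3, 4, 5};

    // Imperative (procedural) style: state is updated step by step in a loop.
    int loopSum = 0;
    for (int v : values) {
        loopSum += v;
    }

    // Functional style: the result is described as one expression;
    // no loop variable or mutable accumulator appears in our code.
    int accumulateSum = std::accumulate(values.begin(), values.end(), 0);

    std::cout << loopSum << " " << accumulateSum << std::endl;  // prints "15 15"
    return 0;
}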

Debugging

Debugging is a very important task in the software development process, because an erroneous program can have significant consequences for its users. Some languages are more prone to some kinds of faults because their specification does not require compilers to perform as much checking as other languages. Use of a static analysis tool can help detect some possible problems.
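
As a hypothetical illustration of the kind of fault meant here (the snippet is our own, not from the text), the fragment below compiles, yet it reads an uninitialized variable and indexes past the end of an array; many static analysis tools and stricter compiler warning levels flag exactly such problems before the program ever runs.

#include <iostream>

int main() {
    int values[3] = {10, 20, 30};
    int sum;                        // Fault 1: sum is never initialized.

    for (int i = 0; i <= 3; ++i) {  // Fault 2: when i == 3 this reads past the end of values.
        sum += values[i];
    }

    std::cout << sum << std::endl;  // The printed result is the product of undefined behavior.
    return 0;
}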

Algorithm

In mathematics, computing, linguistics and related disciplines, an algorithm is a type of effective method in which a list of well-defined instructions for completing a task will, when given an initial state, proceed through a well-defined series of successive states, eventually terminating in an end-state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as probabilistic algorithms, incorporate randomness.

Expressing algorithms

Algorithms can be expressed in many kinds of notation, including natural languages, pseudocode, flowcharts, and programming languages. Natural language expressions of algorithms tend to be verbose and ambiguous, and are rarely used for complex or technical algorithms. Pseudocode and flowcharts are structured ways to express algorithms that avoid many of the ambiguities common in natural language statements, while remaining independent of a particular implementation language. Programming languages are primarily intended for expressing algorithms in a form that can be executed by a computer, but are often used as a way to define or document algorithms.
Algorithm LargestNumber
  Input: A non-empty list of numbers L.
  Output: The largest number in the list L.
 
  largest ← L0
  for each item in the list (Length(L) ≥ 1), do
    if the item > largest, then
      largest ← the item
  return largest
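
The same algorithm can also be written in a programming language; the following C++ sketch (the function name largestNumber is ours, chosen to mirror the pseudocode) translates it line by line:

#include <stdexcept>
#include <vector>

// Direct translation of the LargestNumber pseudocode above.
int largestNumber(const std::vector<int>& L) {
    if (L.empty()) {
        throw std::invalid_argument("the list L must be non-empty");
    }
    int largest = L[0];          // largest <- L0
    for (int item : L) {         // for each item in the list
        if (item > largest) {    // if the item > largest, then
            largest = item;      //     largest <- the item
        }
    }
    return largest;              // return largest
}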

Flowchart

Symbols

A typical flowchart from older Computer Science textbooks may have the following kinds of symbols (a short code sketch after this list shows how these symbols map onto program statements):
  • Start and end symbols, represented as lozenges, ovals or rounded rectangles, usually containing the word "Start" or "End", or another phrase signaling the start or end of a process, such as "submit enquiry" or "receive product".
  • Arrows, showing what's called "flow of control" in computer science. An arrow coming from one symbol and ending at another symbol represents that control passes to the symbol the arrow points to.
  • Processing steps, represented as rectangles. Examples: "Add 1 to X"; "replace identified part"; "save changes" or similar.
  • Input/Output, represented as a parallelogram. Examples: Get X from the user; display X.
  • Conditional (or decision), represented as a diamond (rhombus). These typically contain a Yes/No question or True/False test. This symbol is unique in that it has two arrows coming out of it, usually from the bottom point and right point, one corresponding to Yes or True, and one corresponding to No or False. The arrows should always be labeled. More than two arrows can be used, but this is normally a clear indicator that a complex decision is being taken, in which case it may need to be broken down further or replaced with the "pre-defined process" symbol.
  • A number of other symbols that have less universal currency, such as:
    • A Document represented as a rectangle with a wavy base;
    • A Manual input represented by a rectangle with the top irregularly sloping up from left to right. An example would be to signify data entry from a form;
    • A Manual operation represented by a trapezoid with the longest parallel side at the top, to represent an operation or adjustment to a process that can only be made manually;
    • A Data File represented by a cylinder.
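
To relate these symbols to actual code (the mapping below is an illustrative sketch, not taken from the text), each part of a very simple flowchart, the start and end symbols, the input/output parallelograms, a processing rectangle, and a decision diamond with its Yes and No branches, corresponds to one statement in the short program:

#include <iostream>

int main() {                                      // Start symbol
    int x = 0;
    std::cout << "Enter a number: ";              // Output (parallelogram)
    std::cin >> x;                                // Input  (parallelogram)

    x = x + 1;                                    // Processing step "Add 1 to X" (rectangle)

    if (x > 10) {                                 // Decision (diamond) with labeled branches
        std::cout << "X is greater than 10" << std::endl;   // Yes / True branch
    } else {
        std::cout << "X is 10 or less" << std::endl;        // No / False branch
    }
    return 0;                                     // End symbol
}
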
Ref: http://en.wikipedia.org
