Informatics. Information. Alphabet.

Informatics is the science of information processes; of models, algorithms and algorithmization; of programs and programming; of executors of algorithms and various executing systems; and of their use in society, in nature and in cognition [1].

Informatics has three main branches:

1. Theoretical informatics studies the theoretical problems of information environments.

2. Practical informatics studies the practical problems of information environments.

3. Technical informatics studies the technical problems of information environments.

An alphabet is a finite set of distinct signs (symbols) on which the concatenation operation is defined (attaching a symbol to a symbol or to a chain of symbols); with its help, according to certain rules for joining symbols and words, one can form words (chains of symbols) and phrases (chains of words) in this alphabet (over this alphabet).

Any element of the alphabet X is called a letter or a sign.

The length |p| of a word p over the alphabet X is the number of symbols it is composed of.
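As a small illustration (a sketch only; the binary alphabet and the helper is_word are examples, not from the source), words over an alphabet can be modeled as strings, with concatenation as string joining and |p| as the symbol count:

```python
# Sketch: words over an alphabet modeled as strings (illustrative example only).
ALPHABET = {"0", "1"}               # an example alphabet X

def is_word(p: str, alphabet=ALPHABET) -> bool:
    """A word over X is a chain of symbols, each taken from X."""
    return all(symbol in alphabet for symbol in p)

p = "0" + "1" + "1"                 # concatenation: attaching a symbol to a chain of symbols
print(is_word(p))                   # True
print(len(p))                       # |p| = 3, the number of constituent symbols
```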

Information is an ordered sequence of messages that have a specific meaning.

Main properties of information:

  • completeness;
  • relevance;
  • adequacy;
  • understandability;
  • authenticity;
  • mass character;
  • stability;
  • value, etc.

• Information is actualized by means of various forms of messages – certain kinds of signals and symbols.

• In relation to a source or receiver, information is of three types: input, output and internal.

• In relation to the final result, information is initial, intermediate and resulting.

• By the stage of its use, information is primary and secondary.

• By its completeness, information is redundant, sufficient and insufficient.

• By access to it, information is open and closed.

Measure of information.

Messages are measured in bytes, kilobytes, megabytes, gigabytes, terabytes, petabytes and exabytes; in a computer they are encoded, for example, using an alphabet of zeros and ones, and are stored and processed in bits.

Here are the main relationships between the units of measurement of messages:

1 bit (binary digit) = 0 or 1,

1 byte = 8 bits,

1 kilobyte (1 KB) = 2^10 bytes = 2^13 bits,

1 megabyte (1 MB) = 2^20 bytes = 2^23 bits,

1 gigabyte (1 GB) = 2^30 bytes = 2^33 bits,

1 terabyte (1 TB) = 2^40 bytes = 2^43 bits,

1 petabyte (1 PB) = 2^50 bytes = 2^53 bits,

1 exabyte (1 EB) = 2^60 bytes = 2^63 bits.
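These relationships can be checked with a few lines of Python (an illustrative sketch, not part of the source text):

```python
# Sketch: verify the bit counts of the binary-prefixed units listed above.
units = ["kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte", "exabyte"]

for power, name in enumerate(units, start=1):
    size_in_bytes = 2 ** (10 * power)     # 1 KB = 2^10 bytes, 1 MB = 2^20 bytes, ...
    size_in_bits = size_in_bytes * 8      # 1 byte = 8 bits, i.e. an extra factor of 2^3
    print(f"1 {name} = 2^{10 * power} bytes = 2^{10 * power + 3} bits ({size_in_bits} bits)")
```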

The measure of information is a criterion for assessing the amount of information. Usually it is given by some non-negative function defined on a set of events and required to be additive: the measure of a finite union of pairwise disjoint events (sets) equals the sum of the measures of the individual events.
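For the Hartley measure introduced below, additivity shows up as follows (a standard observation, added here for illustration): if two independent systems have N1 and N2 equally possible states, the combined system has N1·N2 states, and log2(N1·N2) = log2 N1 + log2 N2, so the measures of the parts add up.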

Consider various measures of information, beginning with R. Hartley's measure. Let N states of the system S be known (N is the number of experiments with different, equally possible, successive states of the system). If each state of the system is encoded in binary, then the code length d must be chosen so that the number of all different code combinations is not less than N:

2^d ≥ N.

Taking the logarithm of this inequality, we can write:

d ≥ log2 N.

The smallest solution of this inequality, i.e. the measure of the diversity of the set of system states, is given by R. Hartley's formula:

H = log2 N (bits).
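This derivation can be mirrored in a few lines of Python (an illustrative sketch; the function names hartley_measure and min_code_length are not from the source):

```python
import math

def hartley_measure(n_states: int) -> float:
    """Hartley measure H = log2(N) for a system with N equally possible states."""
    return math.log2(n_states)

def min_code_length(n_states: int) -> int:
    """Smallest integer d with 2**d >= N, i.e. d = ceil(log2 N)."""
    return math.ceil(math.log2(n_states))

print(hartley_measure(8))     # 3.0 bits for N = 8 equally possible states
print(min_code_length(100))   # 7, since 2**6 = 64 < 100 <= 128 = 2**7
```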

If an arbitrary element is to be found in the set X = {x1, x2, …, xn}, then (according to Hartley) at least log2 n units of information are required to find it.

A decrease in H indicates a decrease in the variety of states N of the system.

An increase in H indicates an increase in the variety of states N of the system.

The Hartley measure is suitable only for ideal, abstract systems, since in real systems the states of the system are not equally probable.

For such systems a more suitable measure, Shannon's measure, is used. The Shannon measure evaluates information in abstraction from its meaning (content).
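A minimal sketch of the standard Shannon entropy, H = −Σ p_i · log2 p_i, is shown below (the example probability distributions are invented purely for illustration):

```python
import math

def shannon_measure(probabilities) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the system's state probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Equally probable states reproduce the Hartley measure: log2(4) = 2 bits.
print(shannon_measure([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Unequal probabilities give a smaller value: less uncertainty per message.
print(shannon_measure([0.7, 0.1, 0.1, 0.1]))       # ~1.357
```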
