repeated character compression

repeated character compression
compression of repeated characters (a data compression method that replaces repeating character strings with a three-byte combination)

Comprehensive English-Russian and Russian-English Dictionary. 2001.
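
The "three-byte combination" in the definition is typically a marker byte, the repeated character, and a run count. Below is a minimal sketch of such a run-length scheme in Python; the marker value, the minimum run length worth encoding, and the 255-byte cap are illustrative assumptions, not part of the dictionary definition.

    MARKER = 0x1B   # assumed escape byte signalling "a compressed run follows"
    MIN_RUN = 4     # runs shorter than this gain nothing from the 3-byte encoding
    MAX_RUN = 255   # the run length must fit in the single count byte

    def compress(data: bytes) -> bytes:
        out = bytearray()
        i = 0
        while i < len(data):
            run = 1
            while i + run < len(data) and data[i + run] == data[i] and run < MAX_RUN:
                run += 1
            if run >= MIN_RUN or data[i] == MARKER:
                # the three-byte combination: marker, character, count
                out += bytes([MARKER, data[i], run])
            else:
                out += data[i:i + run]              # short runs are copied verbatim
            i += run
        return bytes(out)

    def decompress(data: bytes) -> bytes:
        out = bytearray()
        i = 0
        while i < len(data):
            if data[i] == MARKER:
                out += bytes([data[i + 1]]) * data[i + 2]
                i += 3
            else:
                out.append(data[i])
                i += 1
        return bytes(out)

    assert decompress(compress(b"aaaaaaaabcd")) == b"aaaaaaaabcd"

Literal occurrences of the marker byte in the input are always emitted as a three-byte group, so the decoder can treat every marker byte unambiguously as a run header.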

See what "repeated character compression" means in other dictionaries:

  • data compression — Process of reducing the amount of data needed for storage or transmission of a given piece of information (text, graphics, video, sound, etc.), typically by use of encoding techniques. Data compression is characterized as either lossy or lossless …   Universalium

  • Lossless data compression — is a class of data compression algorithms that allows the exact original data to be reconstructed from the compressed data. The term lossless is in contrast to lossy data compression, which only allows an approximation of the original data to be… …   Wikipedia

  • Huffman coding — Huffman tree generated from the exact frequencies of the text "this is an example of a huffman tree". The frequencies and codes of each character are below. Encoding the sentence with this code requires 135 bits, as opposed to 288 bits if 36…   Wikipedia (see the sketch after this list)

  • Burrows-Wheeler transform — The Burrows-Wheeler transform (BWT, also called block-sorting compression) is an algorithm used in data compression techniques such as bzip2. It was invented by Michael Burrows and David Wheeler in 1994 while working at DEC Systems Research…   Wikipedia

  • Lempel-Ziv-Welch — (LZW) is a universal lossless data compression algorithm created by Abraham Lempel, Jacob Ziv, and Terry Welch. It was published by Welch in 1984 as an improved implementation of the LZ78 algorithm published by Lempel and Ziv in 1978. The… …   Wikipedia

  • information theory — the mathematical theory concerned with the content, transmission, storage, and retrieval of information, usually in the form of messages or data, and esp. by means of computers. [1945–50] …   Universalium

  • Algorithmic efficiency — In computer science, efficiency is used to describe properties of an algorithm relating to how much of various types of resources it consumes. Algorithmic efficiency can be thought of as analogous to engineering productivity for a repeating or… …   Wikipedia

  • Wikipedia:Reference desk/Computing — The Wikipedia Reference Desk covering the topic of computing. …   Wikipedia
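
The Huffman coding entry above notes that the example sentence needs 135 bits with Huffman codes versus 288 bits at 8 bits per character. Here is a small illustrative sketch of how such codes can be built, again in Python; the merging-by-dictionary approach below is just one convenient way to construct the codes and is not taken from any of the entries.

    import heapq
    from collections import Counter

    def huffman_codes(text: str) -> dict:
        # Heap items are (frequency, tie-breaker, {char: code-so-far});
        # the unique tie-breaker keeps tuple comparison away from the dicts.
        heap = [(freq, i, {ch: ""}) for i, (ch, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, low = heapq.heappop(heap)     # two least frequent subtrees
            f2, _, high = heapq.heappop(heap)
            merged = {ch: "0" + code for ch, code in low.items()}
            merged.update({ch: "1" + code for ch, code in high.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]

    sentence = "this is an example of a huffman tree"
    codes = huffman_codes(sentence)
    bits = sum(len(codes[ch]) for ch in sentence)
    print(bits, "bits with Huffman codes vs", 8 * len(sentence), "bits at 8 bits per character")

Because any valid Huffman code is optimal for the given character frequencies, the total comes out to the 135 bits quoted in the entry even though different tie-breaking can yield different individual codes.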

