
GLOSSARY: LOGIC

BIT
 The basis of computing, a BIT is a unit of information with a binary attribute, i.e. two possible states: on/off, +/−, true/false, yes/no, or, most commonly, in binary notation, 1 or 0. It is in this sense, as the ones and zeros represented by tiny charges in silicon chips, the building blocks of modern information technology, that the BIT is ubiquitous.
The modern binary system was codified by the philosopher Leibniz in the 17th century. Leibniz was a polymath whose wide-ranging interests included law, history and linguistics, and who invented calculus independently of Newton (his notation is still the standard in maths). But it is his work on logic and the binary system that has been of fundamental importance to the modern world of computing, and thus to digital imaging. Interestingly, Leibniz was fascinated by the Orient and had studied the Chinese I Ching and its hexagrams, which map to the binary numbers 0 to 111111. Perhaps Yin and Yang can be considered the oldest conceptually binary attributes.
[Illustration: using Leibniz's mechanisms and logic to perform mathematical calculations.]
Leibniz used his system to invent a mechanical calculating machine he catchily called the "stepped reckoner". Way ahead of its time, it challenged the engineers of the day with its mechanical complexity, but two were eventually built. The mechanism Leibniz conceived for the reckoner was the blueprint for mechanical calculating machines right up to the Curta Calculator (see pic), which was developed in 1948 and was in use in some spheres into the 1980s, before electronic calculating devices became reliable in adverse conditions. Rally teams were late users, and I'm sure they would have been seen on some film sets!
Leibniz was also one of the great 17th-century rationalists, along with Descartes and Spinoza, and in philosophy is remembered mainly for his optimism: he was the archetype of Voltaire's Dr Pangloss, who believed "everything is for the best in this, the best of all possible worlds". What a guy.
The concept of the BIT and the binary system was taken forward by an Englishman, George Boole, in the mid 19th century. Using the binary system to develop his ideas about logic, he devised BOOLEAN ALGEBRA, the basis of modern digital computing.

BYTE
 A BYTE is a unit of digital information comprising a number of BITs. In general terms, 8 BITs make a BYTE. An 8-bit number, for instance 10111011, can be used to represent the decimal numbers 0 to 255: 256 values.
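A quick way to check that arithmetic, sketched in Python (the language choice is mine, purely for illustration; the glossary itself contains no code):

```python
# An 8-bit byte can hold 2**8 = 256 distinct values, 0..255.
bits = "10111011"              # the example byte from the text
value = int(bits, 2)           # interpret the string as base 2
print(value)                   # 187
print(2 ** len(bits))          # 256 possible values
print(format(255, "08b"))      # 11111111, the largest 8-bit value
```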
The fact that such an 8-bit number was used in early computer systems, particularly home computers, to represent alphabetical characters and symbols has meant that the 8-bit byte has become a de facto standard. This is despite the fact that its original meaning was just a number of BITs, the actual number depending on the system in use. Early computers used 4- or 7-bit BYTEs as the smallest chunks of information their processors would address; modern computers process in 16-, 32-, 64-bit or greater chunks.
The name BYTE was coined by an IBM scientist, Werner Buchholz. The idea of grouping BITs together into bite-sized chunks that the processor could "bite" into led to the term BYTE, an intentional misspelling of "bite" that could not be accidentally shortened to "BIT" by leaving the "e" off.
Despite the changed spelling, the term causes much confusion. BITs are usually represented by a small "b", BYTEs by a large "B" (although there is a conflict with the International System of Units, where "B" represents the bel, as in decibel). This can lead to misunderstandings, especially where DATA RATES are concerned. Perhaps he should have called his unit a dollop or a slug. An "octet" is an alternative term specifically denoting an 8-bit byte, but it is rarely used outside engineering.
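The b/B confusion bites hardest with DATA RATES: a 100 Mb/s (megabit) link moves only 12.5 MB/s (megabytes). A one-line illustrative conversion, in Python for the sake of example:

```python
def mbps_to_mbytes(mbps: float) -> float:
    """Convert a rate in megabits per second to megabytes per second
    (8 bits per byte)."""
    return mbps / 8

print(mbps_to_mbytes(100))   # 12.5
```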
Although these days processors rarely use only 8 bits in their architecture, the 8-bit byte, derived from its use in alphabetical character encoding, is still the standard unit of data storage. If you open a text editor or a word processor on your computer, create a new document and save the following as a text file:
"WE WERE SOMEWHERE AROUND BARSTOW ON THE EDGE OF THE DESERT WHEN THE DRUGS BEGAN TO TAKE HOLD..."
(Be careful to cut and paste and include the quotation marks.)
...you should find that the file size is 97 bytes: one byte per character in plain ASCII (an editor that adds a trailing newline or a 3-byte UTF-8 byte-order mark will push the total to 98 or 100).
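One byte per ASCII character means the character count is the byte count; a quick illustrative check in Python:

```python
# The quoted string from the text, including the quotation marks.
quote = ('"WE WERE SOMEWHERE AROUND BARSTOW ON THE EDGE OF THE DESERT '
         'WHEN THE DRUGS BEGAN TO TAKE HOLD..."')
print(len(quote))                    # 97 characters
print(len(quote.encode("utf-8")))   # 97 bytes: ASCII is one byte each
```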

QUBIT
 A QUBIT is a "Quantum BIT".
The QUBIT has the same two states as a classical BIT: 1/0, on/off, etc. However, it can also be in a superposition of the two states, i.e. both 1 and 0 at once.
Superposition is one of the mind-blowing concepts of quantum physics: the superposed state is a property predicted by Schrödinger's equation; he of cat-in-the-box fame. (The cat is both dead and alive until you open the box and look!)
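Numerically, a qubit can be sketched as a pair of complex amplitudes whose squared magnitudes sum to 1. This illustrative Python fragment (not taken from any quantum library) shows the equal superposition, where a measurement gives 0 or 1 with probability one half each:

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|**2 + |beta|**2 == 1.  Measuring yields 0 with probability
# |alpha|**2 and 1 with probability |beta|**2.
alpha = 1 / math.sqrt(2)   # equal superposition of 0 and 1
beta = 1 / math.sqrt(2)
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```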
This may seem esoteric, but scientists are already using the spins of electrons in semiconductors as qubits to perform elementary logic operations, and classical supercomputers use quantum simulation to solve complex equations in astrophysics and climate modelling. This technology would be perfect for incredibly powerful video processing.
Who knows: the PlayStation 16, the RED Epic plus plus plus (where do you go from Epic?), the Apple iBrain etc. may all use quantum computing in 50 years' time, creating multiple alternative hyperrealities in our minds.
Film will still be better though. It's 'cause the light goes into the emulsion!

TIMECODE
 Information recorded alongside the picture detailing the time (either real time or an arbitrary reference time) at which each frame is captured.
Takes the form hh:mm:ss:ff (where ff is the frame number, i.e. 00 to 23 at 24 fps for cinema).
Used in editing systems to allow frame-accurate cutting points. TIMECODE is not the same as GENLOCK.
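Under the hood a timecode is just an absolute frame count. A minimal sketch in Python (purely illustrative, non-drop-frame) of the hh:mm:ss:ff conversion:

```python
def frames_to_timecode(frame_count: int, fps: int = 24) -> str:
    """Convert an absolute frame count to hh:mm:ss:ff (non-drop-frame)."""
    ff = frame_count % fps               # frame number within the second
    total_seconds = frame_count // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(frames_to_timecode(0))              # 00:00:00:00
print(frames_to_timecode(24))             # 00:00:01:00
print(frames_to_timecode(3661 * 24 + 12)) # 01:01:01:12
```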
There are several systems of TC in use, but most important for film and digital cinema are the various SMPTE timecode standards found in most high-end camera systems. The most commonly used system in digital cinema cameras is LTC (Longitudinal Timecode), where timecode is recorded alongside the picture, effectively as an extra audio track.
The TIMEBASE for SMPTE LTC timecode can be 24, 25 or 30 fps, and each frame records an 80-bit code containing the time itself (as hh:mm:ss:ff) plus extra METADATA, including "user bits" and flags for DROP-FRAME and other technical information.
The signal frequency for LTC ranges from 1920 Hz to 4800 Hz, depending on the TIMEBASE and the actual code recorded in each frame: a range that, when played as an audio track through a speaker, gives a high-pitched pulsing sound similar to that of an old-fashioned modem.
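The 80-bit frame explains the lower end of that range: the raw bit rate is simply 80 bits times the frame rate (the upper 4800 Hz figure arises because LTC's biphase-mark encoding adds a mid-bit transition for every 1 bit, doubling the maximum frequency). A sketch of the arithmetic, in Python purely for illustration:

```python
# LTC packs 80 bits into every frame, so the raw bit rate is 80 * fps.
rates = {fps: 80 * fps for fps in (24, 25, 30)}
print(rates)   # {24: 1920, 25: 2000, 30: 2400}  bits per second
```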
Timecode generators are never 100% accurate, so where live-action footage from multiple cameras needs to be cut together (and it is not practical to use a clapper board), cameras must be "jammed" or SYNCed to each other, or to an external clock, either continuously or at regular intervals, to stop the cameras' internal clocks drifting too far apart.
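To see why jamming matters, a rough back-of-envelope estimate (the 10 ppm accuracy figure below is an assumption for illustration, not a spec for any particular camera):

```python
def drift_frames(ppm: float, hours: float, fps: int = 24) -> float:
    """Worst-case drift, in frames, of a clock accurate to `ppm`
    parts per million after `hours` of free-running.
    The ppm value is illustrative, not a real camera specification."""
    seconds_off = ppm * 1e-6 * hours * 3600
    return seconds_off * fps

print(drift_frames(10, 8))   # ~6.9 frames over an 8-hour shoot day
```

Even a modest 10 ppm error puts two free-running cameras several frames apart by wrap, which is why regular re-jamming (or continuous sync) is standard practice.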

BOOLEAN LOGIC
 The system of logic that is the basis for modern computing.
George Boole, an Englishman born in Lincoln, developed his ideas on logic and algebra as the first Professor of Mathematics at Queen's College, Cork, in the mid 19th century. A largely self-taught prodigy, he was influenced by religious and mystical ideas, claiming to have had a revelation as a young man which initially inspired him to join the church; in fact, though, he went on to dedicate his life to academic study. He became the first to build successfully on Leibniz's ideas, combining the philosophical field of logic with mathematical algebra. He fathered several daughters who were notable in their own right, including Lucy Everest Boole, the first female professor of chemistry in England.
The impact of Boole's ideas was wide-ranging, but the key development, crucial some years later for the emergence of computing (and therefore digital cinematography), is two-element Boolean algebra, often known as "switching algebra".
Switching algebra uses simple logic operations as its building blocks. These operations act on BITs and can be replicated by simple mechanisms or by circuits using vacuum tubes or transistors. It is arrays of millions of tiny transistors performing logic operations sequentially that constitute the silicon chip. Combined appropriately, these logic operations can mimic the most complex mathematical functions. Boole's algebra was later built upon by Claude Shannon, an electronic engineer and mathematician whose work at MIT was initially aimed at telephone switching, but whose thesis became the basis of wartime cryptography and modern computing, and is sometimes hailed as the most important scientific paper of all time! Using Boolean algebra to combine simple logic operations efficiently into circuits performing complex functions remains the main challenge for the circuit engineer today.
The three basic logic operations that form the basis of switching algebra are as follows:
The NOT gate:
 one input, one output; the output is the opposite of the input.
The AND gate:
 two inputs, one output; the output is 1 only if both inputs are 1.
Input A   Input B   Output
0         0         0
1         0         0
0         1         0
1         1         1
The OR gate:
 two inputs, one output; the output is 1 if either input (or both) is 1.
Input A   Input B   Output
0         0         0
1         0         1
0         1         1
1         1         1
Other commonly used operations, such as NAND, NOR, XOR and XNOR, can be made by combining these three most basic operations.
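The truth tables above translate directly into code. A minimal illustrative sketch in Python of the three basic gates, with the derived gates built only from them:

```python
# The three basic gates, operating on single bits (0 or 1).
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

# Derived gates, built only from NOT, AND and OR.
def NAND(a, b): return NOT(AND(a, b))
def NOR(a, b): return NOT(OR(a, b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))   # 1 if inputs differ
def XNOR(a, b): return NOT(XOR(a, b))             # 1 if inputs match

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), XOR(a, b))
```

Combining these few primitives into circuits that perform arbitrary arithmetic is exactly the "switching algebra" exercise described above.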


CREDITS: Thanks to Roger Bowles ACO for his continued work in compiling this glossary.
