N-gram

An n-gram is a sequence of n adjacent symbols in a particular order.

The symbols may be n adjacent letters (including punctuation marks and blanks), syllables, or rarely whole words found in a language dataset; or adjacent phonemes extracted from a speech-recording dataset; or adjacent base pairs extracted from a genome. They are collected from a text corpus or a speech corpus.

If Latin numerical prefixes are used, then an n-gram of size 1 is called a "unigram", size 2 a "bigram" (or, less commonly, a "digram"), size 3 a "trigram", etc. If English cardinal numbers are used instead of the Latin prefixes, they are called "four-gram", "five-gram", etc. Similarly, Greek numerical prefixes such as "monomer", "dimer", "trimer", "tetramer", "pentamer", etc., or English cardinal numbers, "one-mer", "two-mer", "three-mer", etc., are used in computational biology for polymers or oligomers of a known size, called k-mers. When the items are words, n-grams may also be called shingles.

In the context of natural language processing (NLP), the use of n-grams allows bag-of-words models to capture information such as word order, which would not be possible in the traditional bag-of-words setting.
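To make the definition concrete, here is a minimal Python sketch (not part of the original article) that extracts word-level and character-level n-grams from a short string; the function name ngrams and the sample text are illustrative choices.

    def ngrams(tokens, n):
        """Return the list of n-grams (as tuples) over a sequence of tokens."""
        return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

    text = "to be or not to be"

    # Word-level 2-grams (bigrams): ('to', 'be'), ('be', 'or'), ('or', 'not'), ...
    word_bigrams = ngrams(text.split(), 2)

    # Character-level 3-grams over the same text, writing "_" for blanks
    # as in Figure 1: ('t', 'o', '_'), ('o', '_', 'b'), ('_', 'b', 'e'), ...
    char_trigrams = ngrams(text.replace(" ", "_"), 3)

    print(word_bigrams)
    print(char_trigrams[:5])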


Examples
In 1951, Claude E. Shannon discussed n-gram models of English in "The Redundancy of English" (Cybernetics; Transactions of the 7th Conference, New York: Josiah Macy, Jr. Foundation, 1951). For example (a small sampling sketch follows these examples):

  • 3-gram character model (random draw based on the probabilities of each trigram): in no ist lat whey cratict froure birs grocid pondenome of demonstures of the retagin is regiactiona of cre
  • 2-gram word model (random draw of words taking into account their transition probabilities): the head and in frontal attack on an english writer that the character of this point is therefore another method for the letters that the time of who ever told the problem for an unexpected
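A character-level trigram sampler in the spirit of Shannon's experiment can be sketched as follows. This is only an illustration, not Shannon's actual procedure (he drew from tables of English trigram statistics); the toy training text and function names here are invented for the example.

    import random
    from collections import Counter, defaultdict

    def train_char_trigrams(text):
        """For every 2-character history, count how often each next character follows it."""
        counts = defaultdict(Counter)
        for i in range(len(text) - 2):
            history, nxt = text[i:i + 2], text[i + 2]
            counts[history][nxt] += 1
        return counts

    def generate(counts, length=80, seed="th"):
        """Draw characters one at a time with probability proportional to the trigram counts."""
        out = seed
        for _ in range(length):
            followers = counts.get(out[-2:])
            if not followers:
                break
            chars, weights = zip(*followers.items())
            out += random.choices(chars, weights=weights)[0]
        return out

    corpus = "the head and in frontal attack on an english writer that the character of this point "
    model = train_char_trigrams(corpus * 3)
    print(generate(model))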

Figure 1. n-gram examples from various disciplines.

  • Protein sequencing (unit: amino acid)
    Sample sequence: ... Cys-Gly-Leu-Ser-Trp ...
    1-gram sequence: ..., Cys, Gly, Leu, Ser, Trp, ...
    2-gram sequence: ..., Cys-Gly, Gly-Leu, Leu-Ser, Ser-Trp, ...
    3-gram sequence: ..., Cys-Gly-Leu, Gly-Leu-Ser, Leu-Ser-Trp, ...
  • DNA sequencing (unit: base pair)
    Sample sequence: ...AGCTTCGA...
    1-gram sequence: ..., A, G, C, T, T, C, G, A, ...
    2-gram sequence: ..., AG, GC, CT, TT, TC, CG, GA, ...
    3-gram sequence: ..., AGC, GCT, CTT, TTC, TCG, CGA, ...
  • Language, character level (unit: character; "_" marks a blank)
    Sample sequence: ...to_be_or_not_to_be...
    1-gram sequence: ..., t, o, _, b, e, _, o, r, ...
    2-gram sequence: ..., to, o_, _b, be, e_, _o, or, r_, ...
    3-gram sequence: ..., to_, o_b, _be, be_, e_o, _or, or_, r_n, _no, not, ot_, t_t, _to, to_, o_b, _be, ...
  • Language, word level (unit: word)
    Sample sequence: ... to be or not to be ...
    1-gram sequence: ..., to, be, or, not, to, be, ...
    2-gram sequence: ..., to be, be or, or not, not to, to be, ...
    3-gram sequence: ..., to be or, be or not, or not to, not to be, ...

Figure 1 shows several example sequences and the corresponding 1-gram, 2-gram and 3-gram sequences.

Here are further examples: word-level 3-grams and 4-grams, with the number of times each appeared, from the Google n-gram corpus (a short counting sketch follows the two lists).

3-grams

  • ceramics collectables collectibles (55)
  • ceramics collectables fine (130)
  • ceramics collected by (52)
  • ceramics collectible pottery (50)
  • ceramics collectibles cooking (45)

4-grams

  • serve as the incoming (92)
  • serve as the incubator (99)
  • serve as the independent (794)
  • serve as the index (223)
  • serve as the indication (72)
  • serve as the indicator (120)
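Counts like those above can be produced for any corpus with a few lines of Python; the following sketch tallies word-level 4-grams with collections.Counter over an invented toy string (the real figures above come from Google's web-scale corpus).

    from collections import Counter

    def count_word_ngrams(text, n):
        """Count how often each word-level n-gram occurs in a text."""
        words = text.split()
        return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

    corpus = "serve as the index serve as the indicator serve as the index"
    print(count_word_ngrams(corpus, 4).most_common(3))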


Further reading
  • Manning, Christopher D.; Schütze, Hinrich: Foundations of Statistical Natural Language Processing, MIT Press, 1999.
  • Damerau, Frederick J.: Markov Models and Linguistic Theory, Mouton, The Hague, 1971.


See also
  • Google Books Ngram Viewer

