What is a "unique word"?
Words counted once regardless of how many times they appear. "The cat ate the cat food" has 6 words but only 4 unique words ("the" and "cat" each appear twice). It's a measure of vocabulary richness.
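This is easy to verify in a few lines of Python -- split the text into words, lowercase them so "The" and "the" match, and collect them into a set:

```python
# Count total vs. unique words in a sentence (a minimal sketch).
text = "The cat ate the cat food"
words = text.lower().split()  # lowercase so "The" and "the" count as one word
unique = set(words)

print(len(words))   # 6 total words
print(len(unique))  # 4 unique words: {'the', 'cat', 'ate', 'food'}
```

Real word counters also strip punctuation before comparing, but the set-based idea is the same.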
Count words, sentences, reading time, and keyword density
Top Keywords
Based on average adult reading speed of ~200 words per minute. Total words ÷ 200 = minutes. Dense technical content will take longer in practice.
"Hello world" is 2 words and 11 characters (including the space). Character count includes spaces, punctuation, and special characters. Some character limits (like Twitter) count in code points, not bytes.
Word count is a deceptively simple metric. Different contexts have wildly different requirements. Twitter/X: 280 characters per tweet. LinkedIn posts: optimal engagement at 1,300-2,000 characters. Blog posts for SEO: 1,500-2,500 words (research suggests longer is better for ranking, but only if the content is substantive). Academic papers: usually 5,000-8,000 words. Email newsletters: highest open and click rates tend to be 200-300 words. Knowing your target count before you write is more efficient than trimming or padding afterward.
The 200 words-per-minute figure for average adult reading speed is derived from research on silent reading. The actual range is wide: slower readers process 100-150 wpm; speed readers can reach 700+ wpm with some loss of comprehension. Speaking speed is slower -- about 130 wpm for normal speech, 150-160 wpm for presentations. Audiobook narrators typically target 150-180 wpm. These numbers inform the "X min read" labels you see on articles, but they're estimates -- dense technical content takes longer than the word count suggests.
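The "X min read" label is just the word count divided by a words-per-minute figure, rounded up. A minimal sketch, using the 200 wpm default from above:

```python
import math

def reading_time_minutes(word_count, wpm=200):
    """Estimate reading time, rounding up to the nearest whole minute."""
    return max(1, math.ceil(word_count / wpm))

print(reading_time_minutes(1500))  # 8  -> "8 min read"
print(reading_time_minutes(50))    # 1  -> never shows "0 min read"
```

Swapping `wpm=130` in gives a speaking-time estimate for scripts and presentations.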
In the early days of SEO, keyword density (the percentage of times a target word appears relative to total words) was a significant ranking factor. If "blue widgets" appeared at 3-5% density, you ranked higher. This led to keyword stuffing -- awkward, repetitive content that was painful to read. Google's algorithms evolved. Today, keyword density itself isn't a ranking signal; semantic relevance is. Google's NLP models understand synonyms, related concepts, and topic clusters. Writing naturally about your topic ranks better than mechanically inserting keywords. Track keyword presence, but don't optimize for density.
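If you do want to track keyword presence, the density calculation itself is straightforward -- occurrences of the keyword divided by total words. A sketch for a single-word keyword (multi-word phrases need a sliding window instead):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of total words that match a single-word keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100 * Counter(words)[keyword.lower()] / len(words)

sample = "Blue widgets are great. Buy blue widgets today."
print(round(keyword_density(sample, "blue"), 1))  # 25.0 (2 of 8 words)
```

Treat the output as a sanity check, not an optimization target -- per the paragraph above, density itself no longer moves rankings.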
The Flesch Reading Ease score (0-100) measures how easy English text is to read. It factors in average sentence length and average word length (syllables). A score of 70+ is "easy" (aimed at 7th grade level); below 30 is "very difficult" (professional/academic). The Flesch-Kincaid Grade Level converts this to a US school grade equivalent. Content marketers targeting general audiences aim for 60-70+. Legal documents often score 30-40. If you write developer documentation, a 50-60 score is appropriate -- technical vocabulary will always reduce readability scores.
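The standard Flesch Reading Ease formula is 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words). Given pre-counted totals it is a one-liner; the hard part in practice is syllable counting, which real tools approximate with heuristics or dictionaries:

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease score from pre-counted totals (higher = easier)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Hypothetical totals for a short, simple passage:
# 100 words, 8 sentences, 130 syllables.
score = flesch_reading_ease(100, 8, 130)
print(round(score, 1))  # 84.2 -> "easy" on the 0-100 scale
```

Longer sentences and longer words both pull the score down, which is why legal and academic prose lands in the 30-40 range.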
AI language models like GPT-4 don't count words -- they count tokens. A token is roughly a word or word piece, but the mapping isn't 1:1. In English, 1 token ≈ 0.75 words, so a 1,000-word document is about 1,333 tokens. Token limits matter for AI APIs -- recent GPT-4 models offer a 128,000-token context window (about 96,000 words). Rare words, technical terms, and non-Latin scripts (Chinese, Japanese, Korean, Arabic) tokenize less efficiently; a single Korean character might be 2-3 tokens. This affects both cost and context limits when using AI APIs.
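For a quick budget check you can invert the 0.75 ratio without calling a tokenizer. A rough sketch -- exact counts require the model's actual tokenizer, and the ratio only holds for ordinary English prose:

```python
def estimate_tokens(word_count, words_per_token=0.75):
    """Rough English token estimate from a word count (1 token ~ 0.75 words)."""
    return round(word_count / words_per_token)

print(estimate_tokens(1000))   # 1333 tokens for a 1,000-word document
print(estimate_tokens(96000))  # 128000 -- roughly a full 128k context window
```

For precise counts against OpenAI models, the `tiktoken` library tokenizes text the same way the API does.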