
User Agent Parser

Decode any User-Agent string — identify browser, engine, OS, device type, and bot classification.

Last updated: March 25, 2026

Client-Side Processing
Input Data Stays on Device
Instant Local Execution


What is User Agent Parser?

A User-Agent (UA) string is a text identifier sent in the HTTP headers of every browser and API request. It tells the server information about the client making the request: which browser (Chrome, Firefox, Safari), which rendering engine (Blink, Gecko, WebKit), which operating system (Windows 11, macOS 14, Android 14), and whether it's a mobile device, desktop, or bot. A typical Chrome desktop UA string looks like: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36".

Despite looking like structured data, UA strings are notoriously hard to parse correctly. Browsers deliberately include compatibility tokens inherited from other browsers (Chrome's UA contains both "AppleWebKit" and "Safari"), which makes naive substring matching highly unreliable. This tool uses ua-parser-js, one of the most widely deployed JavaScript UA parser libraries, to correctly identify browser family, major version, OS, and device type, including detection of bots, crawlers, and CLI tools.
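As a quick illustration of why naive substring matching fails, and why ordered, specific-first token checks help (this is a simplified sketch, not the actual ua-parser-js logic):

```javascript
// A real Chrome desktop UA (from the example above): it contains the
// tokens "Mozilla", "AppleWebKit", and "Safari" even though it is Chrome.
const chromeUA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36";

console.log(chromeUA.includes("Safari")); // true, yet this is not Safari

// Checking the most specific tokens first avoids the worst misfires:
// Edge's UA contains "Chrome" and "Safari"; Chrome's contains "Safari".
function roughBrowserFamily(ua) {
  if (/Edg\//.test(ua)) return "Edge";
  if (/OPR\//.test(ua)) return "Opera";
  if (/Chrome\//.test(ua)) return "Chrome";
  if (/Safari\//.test(ua)) return "Safari";
  if (/Firefox\//.test(ua)) return "Firefox";
  return "Unknown";
}

console.log(roughBrowserFamily(chromeUA)); // "Chrome"
```

A production parser needs hundreds of such patterns plus version, OS, and device extraction, which is exactly what ua-parser-js provides.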

How to Use User Agent Parser

1. Your current browser's User-Agent string is automatically pre-filled in the input.
2. The detected browser, OS, engine, and device type are shown immediately.
3. To parse a different UA, clear the field and paste any User-Agent string.
4. Use the "Copy Structure" button to copy the parsed result as a JSON object.
5. Use the "Is Bot?" indicator to quickly check whether the UA represents a crawler or automated tool.
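The "Copy Structure" step can be pictured as serializing the parsed fields into a plain JSON object. The field names below are illustrative, not the tool's actual schema:

```javascript
// Hypothetical shape of a parsed result, and how a "Copy Structure"
// button might serialize it for the clipboard.
function toCopyableJSON(parsed) {
  return JSON.stringify(
    {
      browser: parsed.browser, // e.g. { name: "Chrome", version: "122.0.0.0" }
      engine: parsed.engine,   // e.g. { name: "Blink" }
      os: parsed.os,           // e.g. { name: "Windows", version: "10" }
      device: parsed.device,   // e.g. { type: "desktop" }
      isBot: parsed.isBot,
    },
    null,
    2
  );
}

const example = {
  browser: { name: "Chrome", version: "122.0.0.0" },
  engine: { name: "Blink" },
  os: { name: "Windows", version: "10" },
  device: { type: "desktop" },
  isBot: false,
};

console.log(toCopyableJSON(example));
```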

Common Use Cases

  • Debugging why a website behaves differently on specific browsers or operating systems
  • Checking what User-Agent your mobile app, API client, or CLI tool sends to servers
  • Testing whether a web crawler or bot is being correctly detected by your backend
  • Identifying whether a suspicious log entry is from a human browser or automated scraper
  • Validating User-Agent header values in API testing tools (Postman, Insomnia, curl)
  • Decoding User-Agents from server access logs to analyze visitor device breakdown
  • Testing User-Agent spoofing in browser automation (Playwright, Puppeteer, Selenium)
  • Checking whether an old browser or OS version your users report is still supported by your app

Example Input and Output

Parsing User-Agent strings from different contexts:

Input: User-Agent string
Chrome Desktop:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36

Googlebot:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

iPhone Safari:
Mozilla/5.0 (iPhone; CPU iPhone OS 17_3 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.3 Mobile/15E148 Safari/604.1
Output: Parsed structured data
Chrome Desktop:
→ Browser: Chrome 122.0.0.0 | Engine: Blink
→ OS: Windows 10 (64-bit) | Device: Desktop | Bot: No

Googlebot:
→ Browser: Googlebot 2.1 | Type: Bot/Crawler
→ OS: — | Device: — | Bot: YES

iPhone Safari:
→ Browser: Safari 17.3 | Engine: WebKit
→ OS: iOS 17.3 | Device: iPhone (Mobile) | Bot: No

Client-Side Processing

UA parsing runs locally using ua-parser-js; UA strings you paste are never sent to our servers. To pre-fill the input field, the tool reads your browser's navigator.userAgent, and that read likewise happens entirely in your browser.

Reduced UA Strings (Chrome 110+)

Chrome 110+ sends a reduced User-Agent string by default — the minor version numbers are replaced with zeros (e.g., Chrome/122.0.0.0 instead of 122.0.6261.112). This is part of the User-Agent Reduction rollout. The major version and device type remain accurate; only patch-level version info is hidden.
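A reduced UA is easy to recognize: the build and patch components are zeroed. A small heuristic check (my own sketch, not part of the tool):

```javascript
// Returns true when the Chrome version looks frozen ("NNN.0.0.0"),
// which indicates a reduced User-Agent string.
function looksReduced(ua) {
  const m = ua.match(/Chrome\/(\d+)\.(\d+)\.(\d+)\.(\d+)/);
  if (!m) return false;
  return m[2] === "0" && m[3] === "0" && m[4] === "0";
}

console.log(looksReduced("Chrome/122.0.0.0 Safari/537.36"));     // true
console.log(looksReduced("Chrome/109.0.5414.120 Safari/537.36")); // false
```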

Server-Side Access to UA

From a server, read the User-Agent from the HTTP header: Node.js/Express: req.headers['user-agent']. PHP: $_SERVER['HTTP_USER_AGENT']. Python/FastAPI: request.headers.get("user-agent"). Nginx log: $http_user_agent variable. Then use ua-parser-js (Node) or ua-parser (Python) for structured parsing.
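In Node.js, incoming header names are lowercased by the HTTP parser, so a small helper like this (an illustrative sketch with made-up example values) works for any framework that exposes the raw headers object:

```javascript
// Node lowercases incoming header names, so headers["user-agent"]
// is reliable regardless of how the client capitalized the header.
function getUserAgent(headers) {
  return headers["user-agent"] ?? null;
}

// What a handler might see for a curl request (illustrative values):
const exampleHeaders = {
  host: "example.com",
  "user-agent": "curl/8.4.0",
  accept: "*/*",
};

console.log(getUserAgent(exampleHeaders));          // "curl/8.4.0"
console.log(getUserAgent({ host: "example.com" })); // null
```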

Frequently Asked Questions

Why do all browsers include "Mozilla/5.0" at the start?
Historical accident. When Netscape Navigator was dominant, websites checked for "Mozilla" in the UA string before serving advanced features. Internet Explorer therefore declared itself "Mozilla" with a "compatible; MSIE" token to trick servers into thinking it was Netscape-capable, and later browsers, including Firefox and Chrome, kept the "Mozilla/5.0" prefix for backward compatibility. Today every browser includes it even though it carries no meaning; it is a vestige of the 1990s browser wars.
Why is User-Agent detection considered unreliable?
UA strings are self-reported — any client can send any string it wants. Chrome intentionally includes "Safari" and "Chromium" tokens. Firefox includes "Gecko". Edge includes "Chrome". This deliberate spoofing means simple substring matching gives wrong results. Additionally, users can change their UA string in browser DevTools or extensions. UA parsing is useful for analytics and rough detection but should never be used for security decisions.
What is User-Agent Client Hints (UA-CH) and does this affect parsing?
User-Agent Client Hints is a newer, privacy-preserving alternative to UA strings, rolled out in Chromium-based browsers. Under UA-CH, the classic UA string is reduced to generic info (the full browser version is frozen as "major.0.0.0") while structured data can be requested via dedicated headers (Sec-CH-UA, Sec-CH-UA-Platform). This tool parses the classic UA string format; UA-CH headers require server-side access.
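For reference, a Sec-CH-UA value is a comma-separated list of quoted brand/version pairs. A rough parser for the common case (a sketch only; the header formally uses HTTP Structured Field syntax, which a full implementation should follow):

```javascript
// Parses e.g. '"Chromium";v="122", "Not(A:Brand";v="24", "Google Chrome";v="122"'
// into [{ brand, version }, ...]. Handles the common shape only.
function parseSecChUa(value) {
  const brands = [];
  for (const part of value.split(",")) {
    const m = part.trim().match(/^"([^"]*)";v="([^"]*)"$/);
    if (m) brands.push({ brand: m[1], version: m[2] });
  }
  return brands;
}

const brands = parseSecChUa(
  '"Chromium";v="122", "Not(A:Brand";v="24", "Google Chrome";v="122"'
);
console.log(brands.map((b) => b.brand)); // ["Chromium", "Not(A:Brand", "Google Chrome"]
```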
How can I detect bots and crawlers reliably?
The UA string is the first check — bots usually identify themselves (Googlebot, AhrefsBot, SemrushBot). ua-parser-js has a bot detection database. However, sophisticated scrapers spoof legitimate Chrome UAs. For server-side bot detection, combine UA checking with: IP reputation lists, request rate analysis, timezone/screen resolution validation (JavaScript), and honeypot links.
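A first-pass, self-identification check can be as simple as a token list. This is illustrative only; the tokens come from the crawler examples on this page, and real bot detection needs the additional server-side signals listed above:

```javascript
// Known self-identifying bot/CLI tokens. Spoofed UAs will pass this
// check, so treat a "false" here as "not obviously a bot", nothing more.
const BOT_TOKENS = [
  /Googlebot/i, /bingbot/i, /AhrefsBot/i, /SemrushBot/i,
  /GPTBot/i, /CCBot/i, /curl\//i, /python-requests/i,
];

function looksLikeBot(ua) {
  return BOT_TOKENS.some((re) => re.test(ua));
}

console.log(
  looksLikeBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
); // true
console.log(looksLikeBot("curl/8.4.0")); // true
```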
What is the User-Agent string format for major crawlers?
Googlebot: "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)". Bingbot: "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)". GPTBot: "GPTBot/1.0 (+https://openai.com/gptbot)". CCBot (Common Crawl): "CCBot/2.0 (https://commoncrawl.org/faq/)".
Can I use this to see my own browser's User-Agent?
Yes — the tool auto-fills your current browser's UA string on load, so you can immediately see how it's parsed. This is useful for verifying your browser's version, checking if you're on a mobile emulation mode in DevTools, or seeing what information websites receive when you visit them.

How This Tool Works

The browser's navigator.userAgent string is read and pre-populated in the input on load. The input value (or pasted UA string) is passed to ua-parser-js, which applies a large database of regular expression patterns to identify browser family, version, rendering engine, operating system, and device type. A separate "is bot" check compares the UA against known bot and crawler patterns. The parsed result is displayed as structured data fields and also as a copyable JSON object.
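The flow above can be sketched end to end. The parse step is stubbed with a toy regex here, because the real work is delegated to ua-parser-js's pattern database; field names are illustrative:

```javascript
// End-to-end sketch of the tool's flow. parseUA is a stand-in for
// ua-parser-js, which applies hundreds of regex patterns instead.
function parseUA(ua) {
  const browser = (ua.match(/(Chrome|Firefox|Safari)\/[\d.]+/) || [])[1] || "Unknown";
  const isBot = /bot|crawler|spider/i.test(ua);
  return { browser, isBot };
}

function renderResult(ua) {
  const parsed = parseUA(ua);
  return {
    fields: parsed,                        // shown as structured fields
    json: JSON.stringify(parsed, null, 2), // the copyable JSON object
  };
}

// In the browser the input would be pre-filled from navigator.userAgent;
// here a fixed example string stands in for it.
const r = renderResult(
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
);
console.log(r.fields.isBot); // true
```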

Technical Stack

  • ua-parser-js library
  • Browser navigator.userAgent API
  • Bot detection patterns
  • Client-side only