browser_create_context
Create a new isolated browser context with its own cookies, storage, and optional proxy configuration. Each context acts as an independent browser session. Use this to create multiple isolated browsing sessions, configure proxy/Tor connections, load browser profiles with saved fingerprints, and enable/disable LLM features. Returns a context_id to use with other browser tools.
Usage Example
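The example below sketches a call to this tool. The tool name and parameter names come from the reference on this page; the surrounding request shape (an MCP-style tools/call envelope) and the transport used to send it are assumptions, since the invoking client is not specified here.

```python
import json

# Sketch of a browser_create_context call routed through a local Tor
# SOCKS proxy. The "jsonrpc"/"tools/call" envelope is an assumption
# (MCP-style); only the tool name and argument names are from this page.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "browser_create_context",
        "arguments": {
            # socks5h resolves DNS remotely (recommended for privacy).
            "proxy_type": "socks5h",
            "proxy_host": "127.0.0.1",
            "proxy_port": "9050",
            # Loads this profile if it exists, otherwise creates and saves it.
            "profile_path": "profiles/research.json",
            "timezone": "Europe/London",
        },
    },
}

print(json.dumps(request, indent=2))
```

The returned context_id is then passed to the other browser tools.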
Parameters
Optional
profile_path (string)
Path to a browser profile JSON file (or upload the file via multipart/form-data). If the file exists, its fingerprints, cookies, and settings are loaded. If not, a new profile is created and saved to this path. Encrypted profiles (from browser_download_profile) are automatically detected and decrypted.

os (enum: windows | macos | linux)
Filter profiles by operating system. If set, only profiles matching this OS are used.

gpu (string)
Filter profiles by GPU vendor/model. If set, only profiles with a matching GPU are used. Examples: 'nvidia', 'amd', 'intel'.

screen_size (enum: 1920x1080 | 2560x1440 | 3440x1440 | 3840x2160 | +3 more)
Screen resolution for the browser context, in 'WIDTHxHEIGHT' format. If not set, a random screen size is selected from the monitor catalog.

timezone (string)
Override the browser timezone, in IANA format (e.g., 'America/New_York', 'Europe/London', 'Asia/Tokyo'). If not set, falls back to: 1) the proxy-detected timezone (if a proxy is configured with spoof_timezone), 2) the VM profile timezone, 3) the system default. This parameter works without a proxy configured.

resource_blocking (boolean)
Enable or disable resource blocking (ads, trackers, analytics). When enabled, blocks requests to known ad networks, trackers, and analytics services. Default: true (enabled).

agent_signature (boolean)
Enable Web Bot Auth (RFC 9421) request signing for this context. When enabled, every outgoing HTTP request is signed with Ed25519 Signature, Signature-Input, and Signature-Agent headers. Requires OWL_WBA_ENABLED=true and OWL_WBA_CONTACTS set. Default: false.

proxy_type (enum: http | https | socks4 | socks5 | socks5h | gae)
Type of proxy server to use. Use 'socks5h' for remote DNS resolution (recommended for privacy). Use 'gae' for the private app proxy.

proxy_host (string)
Proxy server hostname or IP address (e.g., '127.0.0.1' or 'proxy.example.com').

proxy_port (string)
Proxy server port number (e.g., 8080 for an HTTP proxy, 9050 for Tor).

proxy_username (string)
Username for proxy authentication. Only required if the proxy server requires credentials.

proxy_password (string)
Password for proxy authentication. Only required if the proxy server requires credentials.

proxy_stealth (boolean)
Enable stealth mode to prevent proxy/VPN detection. Blocks WebRTC leaks and other detection vectors. Default: true when a proxy is configured.

proxy_ca_cert_path (string)
Path to a custom CA certificate file (.pem, .crt, .cer) for SSL-interception proxies. Required when using Charles Proxy, mitmproxy, or similar HTTPS inspection tools.

proxy_ca_key_path (string)
Path to the CA private key file for GAE/private app proxy MITM. Required for generating per-domain certificates when using the 'gae' proxy type.

proxy_trust_custom_ca (boolean)
Trust the custom CA certificate for SSL interception. Enable when using Charles Proxy, mitmproxy, or similar tools that intercept HTTPS traffic. Default: false.

is_tor (boolean)
Explicitly mark this proxy as a Tor connection. Enables circuit isolation so each context gets a unique exit-node IP. Auto-detected if the proxy is localhost:9050 or localhost:9150 with socks5/socks5h.

tor_control_port (string)
Tor control port for circuit isolation. Used to send SIGNAL NEWNYM to get a new exit node. Default: auto-detect (tries 9051, then 9151). Set to -1 to disable circuit isolation.

tor_control_password (string)
Password for Tor control-port authentication. Leave empty to use cookie authentication (the default) or no auth.

llm_enabled (boolean)
Enable or disable LLM features for this context. When enabled, allows AI-powered tools such as browser_query_page, browser_summarize_page, and browser_nla. Default: true.

llm_use_builtin (boolean)
Use the built-in llama-server for LLM inference. When true, uses the bundled local model. Set to false to use an external LLM provider. Default: true.

llm_endpoint (string)
External LLM API endpoint URL (e.g., 'https://api.openai.com/v1' for OpenAI). Only used when llm_use_builtin is false.

llm_model (string)
External LLM model name (e.g., 'gpt-4-vision-preview' for OpenAI). Only used when llm_use_builtin is false.

llm_api_key (string)
API key for the external LLM provider. Required when using an external LLM endpoint.
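The is_tor auto-detection rule above (a socks5/socks5h proxy on localhost port 9050 or 9150) can be restated as a small predicate. This is an illustrative sketch, not the server's actual implementation; the function name is invented, and treating '127.0.0.1' as equivalent to 'localhost' is an assumption.

```python
def looks_like_tor(proxy_type: str, proxy_host: str, proxy_port: str) -> bool:
    """Restates the documented is_tor auto-detection: a socks5/socks5h
    proxy on localhost:9050 or localhost:9150 is treated as Tor unless
    is_tor is set explicitly. ('127.0.0.1' == 'localhost' is assumed.)"""
    return (
        proxy_type in ("socks5", "socks5h")
        and proxy_host in ("localhost", "127.0.0.1")
        and proxy_port in ("9050", "9150")
    )
```

When this predicate would return false but the proxy really is Tor (e.g., a remote Tor gateway), pass is_tor explicitly to keep circuit isolation enabled.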
Response
Returns a JSON object with the operation result.
{
"success": true,
"result": <value>
}
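A caller should check success before using the result. Per the tool description, the result carries the new context_id; the payload string below is hypothetical, and treating result as the context id directly is an assumption based on that description rather than a documented schema.

```python
import json

# Hypothetical response payload; the generic schema above only guarantees
# "success" and "result". Reading result as the context_id is an
# assumption based on the tool description.
raw = '{"success": true, "result": "ctx_12345"}'
response = json.loads(raw)

if response["success"]:
    context_id = response["result"]  # pass to other browser tools
else:
    raise RuntimeError("browser_create_context failed")
```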