Proxies with Selenium, Puppeteer & Playwright

Code examples for adding a RelayKit USA 4G/5G mobile proxy to your browser automation scripts. Covers Playwright (Python & JavaScript), Puppeteer, and Selenium (Python & Java).

Overview

RelayKit proxies support both HTTP and SOCKS5 protocols on fixed ports. Use the format below based on your framework.

Proxy endpoints:

```
HTTP:   http://USERNAME:PASSWORD@HOST:8000
SOCKS5: socks5://USERNAME:PASSWORD@HOST:9000
```

Replace HOST, USERNAME, and PASSWORD with the values from your dashboard: relaykit.net/dashboard → Proxies tab → click your order.

For most scraping use cases, Playwright is recommended — it has the best cross-browser support, built-in async, and native authenticated proxy support with a clean API. Puppeteer requires a workaround for authenticated proxies. Selenium requires a browser extension or seleniumwire for username/password authentication.

Playwright (Recommended)

Playwright has first-class support for authenticated proxies — pass the proxy object directly to launch(). Works with HTTP and SOCKS5.

Python

```python
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(
        proxy={
            "server": "http://HOST:8000",
            "username": "YOUR_USERNAME",
            "password": "YOUR_PASSWORD",
        }
    )
    page = browser.new_page()
    page.goto("https://example.com")
    print(page.title())
    browser.close()
```

JavaScript / Node.js

```javascript
const { chromium } = require('playwright');

(async () => {
  const browser = await chromium.launch({
    proxy: {
      server: 'http://HOST:8000',
      username: 'YOUR_USERNAME',
      password: 'YOUR_PASSWORD',
    },
  });
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
```
Playwright async Python is also available via from playwright.async_api import async_playwright. The proxy config is identical — just use await syntax.
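As a minimal sketch of the async variant (HOST and the credential placeholders are the same stand-ins as above), the proxy dict can be built once and passed to `launch()` exactly as in the sync example:

```python
import asyncio

def proxy_config(host: str, username: str, password: str, port: int = 8000) -> dict:
    """Build the proxy dict Playwright's launch() expects.

    Placeholder values (HOST, YOUR_USERNAME, YOUR_PASSWORD) come from
    your RelayKit dashboard.
    """
    return {
        "server": f"http://{host}:{port}",
        "username": username,
        "password": password,
    }

async def main() -> None:
    # Import inside main() so the helper above is usable without Playwright installed
    from playwright.async_api import async_playwright

    async with async_playwright() as p:
        browser = await p.chromium.launch(
            proxy=proxy_config("HOST", "YOUR_USERNAME", "YOUR_PASSWORD")
        )
        page = await browser.new_page()
        await page.goto("https://example.com")
        print(await page.title())
        await browser.close()

# To run: asyncio.run(main())
```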

Puppeteer

Puppeteer doesn't natively support authenticated proxy credentials in launch() args. Pass the proxy server URL as a Chrome arg, then authenticate using page.authenticate().

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    args: ['--proxy-server=http://HOST:8000'],
  });
  const page = await browser.newPage();

  // Authenticate on every new page
  await page.authenticate({
    username: 'YOUR_USERNAME',
    password: 'YOUR_PASSWORD',
  });

  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
```
Note: You must call page.authenticate() on each new page you open. If you open multiple pages in the same browser instance, authenticate each one individually.

Selenium

Selenium doesn't support authenticated proxies natively via ChromeOptions. The standard approach is to pass the proxy server without credentials as a Chrome argument and whitelist your IP, or to use selenium-wire (Python) when you need username/password authentication.

Python (unauthenticated / seleniumwire)

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Basic setup (no auth — use IP whitelisting from the RelayKit dashboard)
options = Options()
options.add_argument('--proxy-server=http://HOST:8000')

driver = webdriver.Chrome(options=options)
driver.get('https://example.com')
driver.quit()

# ── OR ── use selenium-wire for an authenticated proxy:
# pip install selenium-wire
from seleniumwire import webdriver as wire_webdriver

wire_options = {
    'proxy': {
        'http': 'http://YOUR_USERNAME:YOUR_PASSWORD@HOST:8000',
        'https': 'http://YOUR_USERNAME:YOUR_PASSWORD@HOST:8000',
    }
}

driver = wire_webdriver.Chrome(seleniumwire_options=wire_options)
driver.get('https://example.com')
driver.quit()
```
The easiest approach with Selenium is to enable IP Whitelisting on your RelayKit proxy (Dashboard → Proxies → Whitelist). Add your server IP — then you don't need username/password authentication at all.

Java

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class ProxyExample {
    public static void main(String[] args) {
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--proxy-server=http://HOST:8000");

        WebDriver driver = new ChromeDriver(options);
        driver.get("https://example.com");
        System.out.println(driver.getTitle());
        driver.quit();
    }
}
```

For authenticated proxies in Java, use a Chrome extension approach or BrowserMob Proxy. The simplest production approach remains IP whitelisting — add your server IP in RelayKit dashboard to skip credential auth entirely.

Using SOCKS5

SOCKS5 (port 9000) is supported by Playwright and most modern frameworks. Swap the server URL in your proxy config. Note that Chromium-based browsers do not support username/password authentication for SOCKS proxies, so if SOCKS5 credentials fail, fall back to HTTP on port 8000 or enable IP whitelisting:

Python (Playwright)
```python
browser = p.chromium.launch(
    proxy={
        "server": "socks5://HOST:9000",
        "username": "YOUR_USERNAME",
        "password": "YOUR_PASSWORD",
    }
)
```
JavaScript (Playwright)
```javascript
const browser = await chromium.launch({
  proxy: {
    server: 'socks5://HOST:9000',
    username: 'YOUR_USERNAME',
    password: 'YOUR_PASSWORD',
  },
});
```

Tips for scraping with mobile proxies

Session management

RelayKit proxies are dedicated sticky lines — your IP stays the same for the entire rental period. This is ideal for scraping flows that require session continuity (login, pagination, checkout). If you need a fresh IP mid-scrape, use the dashboard's IP rotation button or set up auto-rotation.

Rotation strategy

For large-scale scraping across many targets, consider using multiple proxy orders across different US cities. Rotate between them in your code to distribute requests across carrier IPs. Each rotation from the dashboard gives you a new mobile IP.
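A simple round-robin rotation over several proxy orders can be sketched like this (the endpoint strings are hypothetical placeholders, not real RelayKit hostnames):

```python
from itertools import cycle

# Hypothetical endpoints — substitute the HOST/credential values from your own orders
PROXIES = [
    "http://USER:PASS@host-dallas:8000",
    "http://USER:PASS@host-miami:8000",
    "http://USER:PASS@host-seattle:8000",
]

proxy_pool = cycle(PROXIES)

def next_proxy() -> str:
    """Return the next proxy endpoint in round-robin order."""
    return next(proxy_pool)
```

Pass `next_proxy()` as the `server` value each time you launch a browser (or create a context) so requests are spread evenly across your orders.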

Rate limiting

Even with mobile proxies, aggressive request rates can trigger bot detection. Add randomized delays between requests (500ms–3s is a common range). Mobile IPs raise your baseline trust score but won't override unrealistic request patterns.
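A randomized delay in the 500ms–3s range mentioned above can be sketched with the standard library (function names here are illustrative, not part of any API):

```python
import random
import time

def sample_delay(low: float = 0.5, high: float = 3.0) -> float:
    """Pick a randomized delay, in seconds, within [low, high]."""
    return random.uniform(low, high)

def polite_sleep(low: float = 0.5, high: float = 3.0) -> None:
    """Sleep for a randomized interval between requests."""
    time.sleep(sample_delay(low, high))
```

Call `polite_sleep()` between `page.goto()` calls (or between any two requests) so the request timing varies instead of firing at a fixed machine-like cadence.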

IP whitelisting for production

In production environments, enable IP whitelisting on your RelayKit proxy and whitelist your server IP. This removes the need for credential-based auth in your code and adds a security layer — only your server can use the proxy.

Get your RelayKit proxy

Real 4G/5G carrier IPs for Playwright, Puppeteer, and Selenium. 50+ US cities, unlimited bandwidth.