Using HTTP Proxies in Java: HttpClient, OkHttp, Jsoup & More

A code-first guide to configuring HTTP proxies in Java 17+. Covers Java 11+ HttpClient, OkHttp, Jsoup, Apache HttpClient, connection pooling, TLS, and parallel scraping with residential proxy pools.


If you're building web scrapers, API clients, or automation tools in Java, you'll eventually hit rate limits, geo-blocks, or IP bans. That's when you need a Java HTTP proxy solution. This guide shows you exactly how to configure proxies across the major Java HTTP clients—Java 11+ HttpClient, OkHttp, Jsoup, and Apache HttpClient—with production-ready patterns for authentication, connection pooling, retries, and parallel scraping.

Why Proxy Support Matters in Java Applications

Modern web infrastructure treats suspicious request patterns harshly. Make too many requests from one IP, and you'll see HTTP 429 responses, CAPTCHAs, or outright blocks. Residential and datacenter proxies distribute your traffic across multiple IP addresses, making your requests appear to come from different users.

Java's HTTP client landscape is fragmented. The built-in java.net.http.HttpClient (Java 11+) is excellent for modern projects, but OkHttp dominates Android and many backend services. Jsoup is the go-to for HTML parsing. Apache HttpClient still powers legacy enterprise systems. Each has its own proxy configuration API—this guide covers them all.

Java 11+ HttpClient with ProxySelector

The java.net.http.HttpClient introduced in Java 11 is the modern standard for HTTP in the JDK. It supports HTTP/2, WebSockets, and async operations natively. Proxy configuration uses ProxySelector—a flexible approach that lets you route different requests through different proxies.

Basic Proxy Configuration

Here's a complete example using ProxyHat residential proxies with Java's HttpClient:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;

public class HttpClientProxyExample {

    private static final String PROXY_HOST = "gate.proxyhat.com";
    private static final int PROXY_PORT = 8080;

    public static void main(String[] args) throws Exception {
        // Create a ProxySelector that routes all HTTP requests through the proxy
        ProxySelector proxySelector = new ProxySelector() {
            @Override
            public List<Proxy> select(URI uri) {
                // Return HTTP proxy for HTTP/HTTPS URIs
                Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress(PROXY_HOST, PROXY_PORT));
                return List.of(proxy);
            }

            @Override
            public void connectFailed(URI uri, SocketAddress sa, IOException ioe) {
                System.err.println("Proxy connection failed: " + ioe.getMessage());
            }
        };

        // Build the HttpClient with proxy and timeouts
        HttpClient client = HttpClient.newBuilder()
            .proxy(proxySelector)
            .connectTimeout(Duration.ofSeconds(10))
            .followRedirects(HttpClient.Redirect.NORMAL)
            .build();

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://httpbin.org/ip"))
            .timeout(Duration.ofSeconds(30))
            .GET()
            .build();

        HttpResponse<String> response = client.send(request,
            HttpResponse.BodyHandlers.ofString());

        System.out.println("Status: " + response.statusCode());
        System.out.println("Body: " + response.body());
    }
}
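The selector above routes everything through one proxy, but select() receives each request's URI, so you can route selectively — for example, proxy only the hosts that rate-limit you and go direct everywhere else. A sketch, reusing the same placeholder gateway host:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.util.List;

public class PerHostProxySelector extends ProxySelector {

    private final Proxy proxy = new Proxy(Proxy.Type.HTTP,
        new InetSocketAddress("gate.proxyhat.com", 8080));

    @Override
    public List<Proxy> select(URI uri) {
        // Proxy only requests to hosts known to block heavy traffic
        if (uri.getHost() != null && uri.getHost().endsWith("httpbin.org")) {
            return List.of(proxy);
        }
        // Everything else connects directly
        return List.of(Proxy.NO_PROXY);
    }

    @Override
    public void connectFailed(URI uri, SocketAddress sa, IOException ioe) {
        System.err.println("Proxy connect failed for " + uri + ": " + ioe.getMessage());
    }
}
```

Pass an instance to `HttpClient.newBuilder().proxy(...)` exactly as in the example above.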

Authenticated Proxies with Authenticator

Most proxy services require authentication. Java HttpClient handles this via the Authenticator class, which receives authentication challenges and responds with credentials. One gotcha: since JDK 8u111, Basic authentication is disabled by default for HTTPS tunneling (the proxy CONNECT request). If proxy auth works for http:// URLs but fails for https://, clear the restriction with -Djdk.http.auth.tunneling.disabledSchemes="" or the equivalent System.setProperty call before the first request.

import java.io.IOException;
import java.net.Authenticator;
import java.net.InetSocketAddress;
import java.net.PasswordAuthentication;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;

public class AuthenticatedProxyExample {

    private static final String PROXY_HOST = "gate.proxyhat.com";
    private static final int PROXY_PORT = 8080;
    private static final String USERNAME = "your-username";
    private static final String PASSWORD = "your-password";

    public static void main(String[] args) throws Exception {
        // ProxySelector for HTTP proxy
        ProxySelector proxySelector = new ProxySelector() {
            @Override
            public List<Proxy> select(URI uri) {
                Proxy proxy = new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress(PROXY_HOST, PROXY_PORT));
                return List.of(proxy);
            }

            @Override
            public void connectFailed(URI uri, SocketAddress sa, IOException ioe) {
                System.err.println("Connection failed: " + ioe.getMessage());
            }
        };

        // Authenticator provides credentials for proxy authentication
        Authenticator authenticator = new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                // Check if this is a proxy authentication challenge
                if (getRequestorType() == RequestorType.PROXY) {
                    return new PasswordAuthentication(USERNAME, PASSWORD.toCharArray());
                }
                return null;
            }
        };

        HttpClient client = HttpClient.newBuilder()
            .proxy(proxySelector)
            .authenticator(authenticator)
            .connectTimeout(Duration.ofSeconds(10))
            .build();

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://httpbin.org/ip"))
            .header("User-Agent", "Java-ProxyHat-Client/1.0")
            .GET()
            .build();

        HttpResponse<String> response = client.send(request,
            HttpResponse.BodyHandlers.ofString());

        System.out.println("Response: " + response.body());
    }
}

Geo-targeting tip: ProxyHat lets you specify country or city in the username. Use user-country-US:password for US IPs, or user-country-DE-city-berlin:password for Berlin-specific addresses. This is useful for location-dependent scraping.
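That username convention is easy to generate programmatically. A tiny helper, assuming the user-country-XX[-city-name] flag format described above (the flag names are ProxyHat's; adjust for other providers):

```java
public class GeoTargeting {

    // Appends geo-targeting flags to a proxy username.
    // Passing null for country or city skips that flag.
    static String geoUsername(String baseUser, String country, String city) {
        StringBuilder sb = new StringBuilder(baseUser);
        if (country != null) sb.append("-country-").append(country);
        if (city != null) sb.append("-city-").append(city);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(geoUsername("user", "US", null));      // user-country-US
        System.out.println(geoUsername("user", "DE", "berlin"));  // user-country-DE-city-berlin
    }
}
```

The resulting string goes wherever the plain username went — e.g. into PasswordAuthentication or Credentials.basic() in the examples above.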

OkHttp Proxy Configuration

OkHttp is Square's widely-used HTTP client, especially popular in Android development and microservices. It has a clean API for proxy configuration and a robust interceptor system for retries and logging.

Basic OkHttp Proxy Setup

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.util.concurrent.TimeUnit;
import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public class OkHttpProxyExample {

    private static final String PROXY_HOST = "gate.proxyhat.com";
    private static final int PROXY_PORT = 8080;
    private static final String USERNAME = "your-username";
    private static final String PASSWORD = "your-password";

    public static void main(String[] args) throws IOException {
        // Define the proxy
        Proxy proxy = new Proxy(Proxy.Type.HTTP,
            new InetSocketAddress(PROXY_HOST, PROXY_PORT));

        // Authenticator for proxy authentication
        Authenticator proxyAuthenticator = new Authenticator() {
            @Override
            public Request authenticate(Route route, Response response) throws IOException {
                // Prevent infinite auth loops. Response has no responseCount()
                // method, so count prior attempts via priorResponse()
                int attempts = 1;
                for (Response prior = response.priorResponse(); prior != null;
                        prior = prior.priorResponse()) {
                    attempts++;
                }
                if (attempts >= 3) {
                    return null;
                }
                String credential = Credentials.basic(USERNAME, PASSWORD);
                return response.request().newBuilder()
                    .header("Proxy-Authorization", credential)
                    .build();
            }
        };

        // Build client with proxy, timeouts, and connection pool
        OkHttpClient client = new OkHttpClient.Builder()
            .proxy(proxy)
            .proxyAuthenticator(proxyAuthenticator)
            .connectTimeout(10, TimeUnit.SECONDS)
            .readTimeout(30, TimeUnit.SECONDS)
            .writeTimeout(30, TimeUnit.SECONDS)
            .retryOnConnectionFailure(true)
            .build();

        Request request = new Request.Builder()
            .url("https://httpbin.org/ip")
            .header("User-Agent", "OkHttp-ProxyHat/1.0")
            .build();

        try (Response response = client.newCall(request).execute()) {
            System.out.println("Status: " + response.code());
            System.out.println("Body: " + response.body().string());
        }
    }
}

Connection Pooling and Retry Policy

OkHttp's ConnectionPool reuses TCP connections, reducing latency for repeated requests to the same host. Configure it explicitly for high-throughput scenarios:

import java.net.Proxy;
import java.util.concurrent.TimeUnit;
import okhttp3.Authenticator;
import okhttp3.ConnectionPool;
import okhttp3.OkHttpClient;

public class OkHttpPooledClient {

    public static OkHttpClient createPooledClient(Proxy proxy,
            Authenticator proxyAuthenticator) {

        // Connection pool: max 50 idle connections, 5-minute keep-alive
        ConnectionPool connectionPool = new ConnectionPool(50, 5, TimeUnit.MINUTES);

        return new OkHttpClient.Builder()
            .proxy(proxy)
            .proxyAuthenticator(proxyAuthenticator)
            .connectionPool(connectionPool)
            .connectTimeout(15, TimeUnit.SECONDS)
            .readTimeout(60, TimeUnit.SECONDS)
            .writeTimeout(60, TimeUnit.SECONDS)
            .retryOnConnectionFailure(true)
            // Add interceptor for logging (production: use proper logging)
            .addInterceptor(chain -> {
                long start = System.nanoTime();
                okhttp3.Response response = chain.proceed(chain.request());
                long elapsed = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
                System.out.printf("[%dms] %s -> %d%n",
                    elapsed, response.request().url(), response.code());
                return response;
            })
            .build();
    }
}
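Note that retryOnConnectionFailure only covers transport-level failures; HTTP-level responses like 429 or 503 come back as successful calls and need their own retry loop. A minimal, client-agnostic sketch with exponential backoff (the attempt counts and delays are illustrative, not a recommendation):

```java
import java.util.concurrent.Callable;
import java.util.function.Predicate;

public class Retry {

    // Runs the task, retrying with doubling delays while shouldRetry
    // matches the result (e.g. status code is 429 or >= 500).
    public static <T> T withBackoff(Callable<T> task, Predicate<T> shouldRetry,
            int maxAttempts, long initialDelayMs) throws Exception {
        T result = task.call();
        long delay = initialDelayMs;
        for (int attempt = 1; attempt < maxAttempts && shouldRetry.test(result); attempt++) {
            Thread.sleep(delay);
            delay *= 2; // exponential backoff
            result = task.call();
        }
        return result;
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Simulated endpoint: returns 503 twice, then 200
        int status = withBackoff(
            () -> ++calls[0] < 3 ? 503 : 200,
            code -> code == 429 || code >= 500,
            5, 10);
        System.out.println("status=" + status + " after " + calls[0] + " calls");
    }
}
```

To use it with OkHttp, wrap `client.newCall(request).execute()` in the Callable and test `response.code()` in the predicate (closing non-final responses as you go).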

Jsoup Proxy Support for HTML Parsing

Jsoup is the de facto standard for HTML parsing in Java. It fetches and parses HTML in one step, making it ideal for web scraping. Jsoup can route through a proxy via Connection.proxy(), but it has no proxy-credential API of its own; since it uses HttpURLConnection under the hood, proxy authentication goes through a JVM-wide java.net.Authenticator.

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import java.net.Authenticator;
import java.net.InetSocketAddress;
import java.net.PasswordAuthentication;
import java.net.Proxy;

public class JsoupProxyExample {

    private static final String PROXY_HOST = "gate.proxyhat.com";
    private static final int PROXY_PORT = 8080;
    private static final String USERNAME = "your-username";
    private static final String PASSWORD = "your-password";

    public static void main(String[] args) throws Exception {
        // Create HTTP proxy
        Proxy proxy = new Proxy(Proxy.Type.HTTP,
            new InetSocketAddress(PROXY_HOST, PROXY_PORT));

        // Jsoup rides on HttpURLConnection, which ignores the
        // http.proxyUser/http.proxyPassword system properties. Register a
        // JVM-wide Authenticator for proxy credentials instead.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                if (getRequestorType() == RequestorType.PROXY) {
                    return new PasswordAuthentication(USERNAME, PASSWORD.toCharArray());
                }
                return null;
            }
        });

        // Since JDK 8u111, Basic auth is disabled for HTTPS tunneling by
        // default; clear the restriction so the proxy CONNECT can authenticate
        System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");

        Document doc = Jsoup.connect("https://example.com")
            .proxy(proxy)
            .userAgent("Mozilla/5.0 (compatible; Jsoup-ProxyHat/1.0)")
            .timeout(30000)
            .get();

        System.out.println("Title: " + doc.title());
        System.out.println("Links: " + doc.select("a[href]").size());
    }
}

Recommended Pattern: OkHttp + Jsoup

For production scrapers, fetch with OkHttp (for its superior proxy and retry handling), then parse with Jsoup:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import java.net.Proxy;
import java.net.InetSocketAddress;
import java.util.concurrent.TimeUnit;

public class OkHttpJsoupScraper {

    private final OkHttpClient client;

    public OkHttpJsoupScraper(String proxyHost, int proxyPort,
            String username, String password) {

        Proxy proxy = new Proxy(Proxy.Type.HTTP,
            new InetSocketAddress(proxyHost, proxyPort));

        this.client = new OkHttpClient.Builder()
            .proxy(proxy)
            .proxyAuthenticator((route, response) -> {
                String cred = okhttp3.Credentials.basic(username, password);
                return response.request().newBuilder()
                    .header("Proxy-Authorization", cred)
                    .build();
            })
            .connectTimeout(15, TimeUnit.SECONDS)
            .readTimeout(60, TimeUnit.SECONDS)
            .retryOnConnectionFailure(true)
            .build();
    }

    public Document fetchAndParse(String url) throws Exception {
        Request request = new Request.Builder()
            .url(url)
            .header("User-Agent", "Mozilla/5.0 (compatible; Scraper/1.0)")
            .header("Accept", "text/html,application/xhtml+xml")
            .build();

        try (Response response = client.newCall(request).execute()) {
            if (!response.isSuccessful()) {
                throw new RuntimeException("HTTP " + response.code());
            }
            String html = response.body().string();
            return Jsoup.parse(html, response.request().url().toString());
        }
    }

    public static void main(String[] args) throws Exception {
        OkHttpJsoupScraper scraper = new OkHttpJsoupScraper(
            "gate.proxyhat.com", 8080, "user-country-US", "password");

        Document doc = scraper.fetchAndParse("https://news.ycombinator.com");
        doc.select(".titleline > a").forEach(el ->
            System.out.println(el.text() + " -> " + el.attr("href"))
        );
    }
}

Apache HttpClient (Legacy Ecosystems)

Apache HttpClient 4.x and 5.x are still common in enterprise environments. Configuration is more verbose but offers fine-grained control:

import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
import org.apache.hc.client5.http.classic.methods.HttpGet;
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.HttpClients;
import org.apache.hc.client5.http.impl.io.PoolingHttpClientConnectionManager;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.http.io.entity.EntityUtils;

public class ApacheHttpClientProxyExample {

    public static void main(String[] args) throws Exception {
        HttpHost proxy = new HttpHost("http", "gate.proxyhat.com", 8080);

        // Credentials provider for proxy auth
        BasicCredentialsProvider credsProvider = new BasicCredentialsProvider();
        credsProvider.setCredentials(
            new AuthScope(proxy),
            new UsernamePasswordCredentials("your-username", "your-password".toCharArray())
        );

        // Connection pooling
        PoolingHttpClientConnectionManager connManager =
            new PoolingHttpClientConnectionManager();
        connManager.setMaxTotal(100);
        connManager.setDefaultMaxPerRoute(20);

        try (CloseableHttpClient client = HttpClients.custom()
            .setProxy(proxy)
            .setDefaultCredentialsProvider(credsProvider)
            .setConnectionManager(connManager)
            .build()) {

            HttpGet request = new HttpGet("https://httpbin.org/ip");
            client.execute(request, response -> {
                System.out.println(EntityUtils.toString(response.getEntity()));
                return null;
            });
        }
    }
}

Parallel Scraping with ExecutorService

When scraping at scale, you'll want to distribute requests across multiple IPs concurrently. Here's a pattern using Java's ExecutorService with rotating residential proxies:

import java.net.InetSocketAddress;
import java.net.Proxy;
import java.util.List;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;
import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class ParallelScraper {

    private static final String PROXY_HOST = "gate.proxyhat.com";
    private static final int PROXY_PORT = 8080;
    private static final String USERNAME = "your-username";
    private static final String PASSWORD = "your-password";

    // Track success/failure rates
    private final AtomicInteger successCount = new AtomicInteger();
    private final AtomicInteger failureCount = new AtomicInteger();

    private final OkHttpClient client;
    private final ExecutorService executor;

    public ParallelScraper(int threadPoolSize) {
        Proxy proxy = new Proxy(Proxy.Type.HTTP,
            new InetSocketAddress(PROXY_HOST, PROXY_PORT));

        Authenticator auth = (route, response) -> {
            // Bail out after a few auth attempts (count prior responses)
            int attempts = 1;
            for (Response prior = response.priorResponse(); prior != null;
                    prior = prior.priorResponse()) {
                attempts++;
            }
            if (attempts >= 3) return null;
            return response.request().newBuilder()
                .header("Proxy-Authorization", Credentials.basic(USERNAME, PASSWORD))
                .build();
        };

        this.client = new OkHttpClient.Builder()
            .proxy(proxy)
            .proxyAuthenticator(auth)
            .connectTimeout(15, TimeUnit.SECONDS)
            .readTimeout(30, TimeUnit.SECONDS)
            .retryOnConnectionFailure(true)
            .build();

        this.executor = Executors.newFixedThreadPool(threadPoolSize);
    }

    public CompletableFuture<String> fetchAsync(String url) {
        return CompletableFuture.supplyAsync(() -> {
            Request request = new Request.Builder()
                .url(url)
                .header("User-Agent", "ParallelScraper/1.0")
                .build();

            try (Response response = client.newCall(request).execute()) {
                if (response.isSuccessful()) {
                    successCount.incrementAndGet();
                    return response.body().string();
                } else {
                    failureCount.incrementAndGet();
                    throw new RuntimeException("HTTP " + response.code());
                }
            } catch (Exception e) {
                failureCount.incrementAndGet();
                throw new CompletionException(e);
            }
        }, executor);
    }

    public void shutdown() {
        executor.shutdown();
        try {
            if (!executor.awaitTermination(60, TimeUnit.SECONDS)) {
                executor.shutdownNow();
            }
        } catch (InterruptedException e) {
            executor.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        ParallelScraper scraper = new ParallelScraper(10);

        List<String> urls = List.of(
            "https://httpbin.org/ip",
            "https://httpbin.org/headers",
            "https://httpbin.org/user-agent",
            "https://httpbin.org/get",
            "https://httpbin.org/status/200"
        );

        // Fan out requests
        List<CompletableFuture<String>> futures = urls.stream()
            .map(scraper::fetchAsync)
            .toList();

        // Wait for everything to settle. allOf().join() throws if any future
        // failed, so swallow the aggregate error; failures are reported below
        CompletableFuture.allOf(futures.toArray(new CompletableFuture[0]))
            .exceptionally(t -> null)
            .join();

        futures.forEach(f -> {
            try {
                String body = f.get();
                System.out.println("Result: " + body.substring(0, Math.min(100, body.length())));
            } catch (Exception e) {
                System.err.println("Failed: " + e.getCause().getMessage());
            }
        });

        scraper.shutdown();
        System.out.printf("Success: %d, Failures: %d%n",
            scraper.successCount.get(), scraper.failureCount.get());
    }
}

Production tip: For session stickiness (same IP for multiple requests), use ProxyHat's session flag: user-session-abc123:password. This is essential for multi-step workflows like login flows or shopping cart interactions.
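The session flag is just another username suffix, so generating a fresh sticky session per workflow is a one-liner. A sketch assuming the user-session-&lt;id&gt;:password convention from the tip above:

```java
import java.util.UUID;

public class StickySessions {

    // Builds a sticky-session username: requests sent with this username
    // reuse the same exit IP until the session expires or the id changes
    static String sessionUsername(String baseUser, String sessionId) {
        return baseUser + "-session-" + sessionId;
    }

    public static void main(String[] args) {
        // One random session id per logical workflow (login, cart, checkout...)
        String sessionId = UUID.randomUUID().toString().replace("-", "").substring(0, 8);
        System.out.println(sessionUsername("user", sessionId));
    }
}
```

Keep one OkHttpClient (or at least one username) per workflow so every step of that workflow shares the session, and switch ids to rotate to a new IP.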

TLS and Custom SSLContext

Some upstream servers use self-signed certificates or non-standard TLS configurations. Java's default SSLContext rejects these. Here's how to configure a custom SSLContext for development or testing:

import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.net.http.HttpClient;
import java.security.SecureRandom;
import java.security.cert.X509Certificate;

public class TlsConfiguration {

    /**
     * Creates an SSLContext that trusts all certificates.
     * WARNING: Only use for development/testing with known upstreams.
     */
    public static SSLContext createTrustAllContext() throws Exception {
        TrustManager[] trustAllCerts = new TrustManager[]{
            new X509TrustManager() {
                @Override
                public void checkClientTrusted(X509Certificate[] chain, String authType) {}
                @Override
                public void checkServerTrusted(X509Certificate[] chain, String authType) {}
                @Override
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
            }
        };

        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, trustAllCerts, new SecureRandom());
        return sslContext;
    }

    public static HttpClient createClientWithCustomSsl(SSLContext sslContext) {
        return HttpClient.newBuilder()
            .sslContext(sslContext)
            .connectTimeout(java.time.Duration.ofSeconds(10))
            .build();
    }

    // For OkHttp, sslSocketFactory() requires the X509TrustManager that was
    // used to initialize the SSLContext, so pass it in alongside the context
    public static okhttp3.OkHttpClient createOkHttpWithCustomSsl(
            SSLContext sslContext, X509TrustManager trustManager) {
        return new okhttp3.OkHttpClient.Builder()
            .sslSocketFactory(sslContext.getSocketFactory(), trustManager)
            .hostnameVerifier((hostname, session) -> true)
            .build();
    }
}

For production, use proper certificate pinning or a custom truststore with only the certificates you expect:

import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.SecureRandom;

public class ProductionTls {

    public static SSLContext createTrustStoreContext(String trustStorePath,
            String trustStorePassword) throws Exception {

        KeyStore trustStore = KeyStore.getInstance("JKS");
        try (FileInputStream fis = new FileInputStream(trustStorePath)) {
            trustStore.load(fis, trustStorePassword.toCharArray());
        }

        TrustManagerFactory tmf = TrustManagerFactory.getInstance(
            TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(trustStore);

        SSLContext sslContext = SSLContext.getInstance("TLSv1.3");
        sslContext.init(null, tmf.getTrustManagers(), new SecureRandom());
        return sslContext;
    }
}

Comparison: Java HTTP Clients for Proxy Usage

| Feature | Java 11+ HttpClient | OkHttp | Apache HttpClient | Jsoup |
|---|---|---|---|---|
| Proxy Support | ProxySelector | Proxy + Authenticator | HttpHost + CredentialsProvider | Proxy (limited) |
| Proxy Auth | Authenticator class | proxyAuthenticator() | CredentialsProvider | Global Authenticator |
| HTTP/2 | Yes | Yes | Yes (5.x) | No |
| Connection Pooling | Built-in | ConnectionPool | PoolingHttpClientConnectionManager | No |
| Async Support | CompletableFuture | Call.enqueue() | Future&lt;HttpResponse&gt; | No |
| Best For | Modern JDK apps | Android, microservices | Enterprise legacy | Simple HTML scraping |

Key Takeaways

  • Java 11+ HttpClient is the modern choice—use ProxySelector and Authenticator for proxy configuration.
  • OkHttp offers the best proxy auth handling and interceptor ecosystem; pair it with Jsoup for HTML parsing.
  • Connection pooling is essential for high-throughput scraping—configure pool size and keep-alive timeouts explicitly.
  • Session stickiness (same IP across requests) requires session flags in the proxy username: user-session-abc123:password.
  • Geo-targeting lets you appear from specific countries or cities—useful for localized content and price comparison.
  • TLS customization via SSLContext handles non-standard certificates, but use proper truststores in production.
  • Parallel scraping with ExecutorService or virtual threads (Java 21+) maximizes throughput across rotating proxies.
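The virtual-thread variant from the last bullet replaces the fixed thread pool entirely: each fetch blocks a cheap virtual thread instead of occupying a platform thread. A sketch (requires Java 21+; the gateway host is the same placeholder as earlier, and ProxySelector.of() routes everything through one HTTP proxy):

```java
import java.net.InetSocketAddress;
import java.net.ProxySelector;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class VirtualThreadScraper {

    public static void main(String[] args) {
        HttpClient client = HttpClient.newBuilder()
            .proxy(ProxySelector.of(new InetSocketAddress("gate.proxyhat.com", 8080)))
            .connectTimeout(Duration.ofSeconds(10))
            .build();

        List<String> urls = List.of(
            "https://httpbin.org/ip",
            "https://httpbin.org/get");

        // One virtual thread per request; close() waits for submitted tasks
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (String url : urls) {
                executor.submit(() -> {
                    HttpRequest request = HttpRequest.newBuilder(URI.create(url)).build();
                    HttpResponse<String> response =
                        client.send(request, HttpResponse.BodyHandlers.ofString());
                    System.out.println(url + " -> " + response.statusCode());
                    return null;
                });
            }
        }
    }
}
```

Because virtual threads are so cheap, the pool-size tuning from the ExecutorService example largely disappears; concurrency is bounded by your proxy plan and the target's tolerance, not by thread count.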

Conclusion

Java gives you multiple solid options for HTTP proxy integration. For new projects, start with Java 11+ HttpClient for its native async support and clean API. If you need more control over retries, interceptors, or Android compatibility, OkHttp is the battle-tested choice. For HTML scraping, combine OkHttp's robust fetching with Jsoup's excellent parsing.

When scaling to production workloads, remember: connection pooling, proper timeout configuration, and retry policies separate reliable scrapers from fragile ones. And always respect robots.txt, rate limits, and the target site's terms of service.

Ready to start scraping with residential proxies? Check ProxyHat pricing for plans that fit your scale, or explore our global proxy locations to see available geo-targeting options.
