For .NET developers building scrapers, automation tools, or price monitoring services, handling HTTP proxies is not optional—it's infrastructure. Between rotating residential IPs, managing sticky sessions, and dealing with TLS interception, the complexity grows quickly. .NET 8 introduces significant performance improvements and clearer APIs for HttpClient, making it the best version yet for proxy-intensive workloads.
This guide focuses on production patterns: configuring HttpClientHandler, building a rotating proxy pool with Dependency Injection, implementing resilience with Polly, and securing connections with custom TLS validation.
## 1. Basic Proxy Configuration with HttpClientHandler
The foundation of proxy usage in .NET is WebProxy combined with HttpClientHandler. While simple, this approach requires careful handling of credentials and proxy bypass rules to avoid leaking requests.
```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public class BasicProxyExample
{
    public static async Task<string> FetchWithProxyAsync()
    {
        // ProxyHat residential endpoint configuration
        var proxyUrl = "http://gate.proxyhat.com:8080";
        var proxy = new WebProxy(proxyUrl)
        {
            Credentials = new NetworkCredential(
                userName: "user-country-US", // Geo-targeting in username
                password: "YOUR_PROXYHAT_PASSWORD"
            ),
            BypassProxyOnLocal = true,
            BypassList = new[] { "localhost", "127.0.0.1" }
        };

        var handler = new HttpClientHandler
        {
            Proxy = proxy,
            UseProxy = true,
            AllowAutoRedirect = true,
            MaxAutomaticRedirections = 5
        };

        // HttpClient should be reused; 'using' here keeps the example short
        using var client = new HttpClient(handler)
        {
            Timeout = TimeSpan.FromSeconds(30)
        };

        try
        {
            var response = await client.GetStringAsync("https://httpbin.org/ip");
            return response;
        }
        catch (HttpRequestException ex)
        {
            Console.WriteLine($"Request failed: {ex.Message}");
            throw;
        }
    }
}
```
This works for single-threaded scripts. However, for production scrapers, you'll need to manage connection pooling and proxy rotation at a higher level.
## 2. SocketsHttpHandler and Pooled Connection Lifetime
In .NET Core and .NET 5+, SocketsHttpHandler is the default internal handler. It offers fine-grained control over connection pooling, which is critical when working with rotating proxies. If you keep a connection open to a proxy for too long, the underlying IP may rotate, breaking your session.
```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public class PooledProxyClient
{
    private readonly HttpClient _client;

    public PooledProxyClient(string username, string password)
    {
        var proxy = new WebProxy("http://gate.proxyhat.com:8080")
        {
            Credentials = new NetworkCredential(username, password)
        };

        // SocketsHttpHandler allows controlling connection lifetime
        var handler = new SocketsHttpHandler
        {
            Proxy = proxy,
            UseProxy = true,
            PooledConnectionLifetime = TimeSpan.FromMinutes(2), // Recycle connections
            PooledConnectionIdleTimeout = TimeSpan.FromMinutes(1),
            MaxConnectionsPerServer = 10 // Concurrency limit per proxy
        };

        _client = new HttpClient(handler);
    }

    public async Task<string> GetAsync(string url)
    {
        // Connections older than 2 minutes will be drained and replaced
        return await _client.GetStringAsync(url);
    }
}
```
Setting PooledConnectionLifetime ensures that your client doesn't hold onto a stale proxy connection indefinitely. This is especially important for residential proxies that rotate IPs periodically.
## 3. Building a Rotating Proxy Pool Service with DI
Hardcoding credentials is fine for testing, but production applications need a Rotating Proxy Pool. This service manages a list of proxy configurations, rotates them, and integrates with .NET's Dependency Injection container.
```csharp
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public interface IProxyPoolService
{
    HttpClient CreateClient(string? countryCode = null);
    Task<string> FetchRotatingIpAsync();
}

public class ProxyPoolService : IProxyPoolService, IDisposable
{
    private readonly ConcurrentDictionary<string, HttpClient> _clients = new();
    private readonly string _baseUrl = "gate.proxyhat.com";
    private readonly string _password;
    private int _counter = 0;

    public ProxyPoolService(string password)
    {
        _password = password;
    }

    // Creates a client with specific geo-targeting and a fresh sticky session.
    // The caller owns (and should dispose) the returned client.
    public HttpClient CreateClient(string? countryCode = null)
    {
        // Create a unique session ID for sticky sessions, or rotate
        var sessionId = Guid.NewGuid().ToString("N").Substring(0, 8);
        var username = string.IsNullOrEmpty(countryCode)
            ? $"user-session-{sessionId}"
            : $"user-country-{countryCode}-session-{sessionId}";
        return BuildClient(username);
    }

    private HttpClient BuildClient(string username)
    {
        var proxy = new WebProxy($"http://{_baseUrl}:8080")
        {
            Credentials = new NetworkCredential(username, _password)
        };
        var handler = new SocketsHttpHandler
        {
            Proxy = proxy,
            UseProxy = true,
            PooledConnectionLifetime = TimeSpan.FromMinutes(5)
        };
        return new HttpClient(handler);
    }

    // Round-robin rotation over a bounded set of cached clients
    public async Task<string> FetchRotatingIpAsync()
    {
        // Interlocked for thread-safe rotation
        var current = Interlocked.Increment(ref _counter);
        var username = $"user-rotate-{current % 100}"; // Example rotation logic
        var client = _clients.GetOrAdd(username, BuildClient);
        return await client.GetStringAsync("https://api.ipify.org");
    }

    public void Dispose()
    {
        foreach (var client in _clients.Values)
        {
            client.Dispose();
        }
        _clients.Clear();
    }
}

// Registration in Program.cs
// builder.Services.AddSingleton<IProxyPoolService>(sp => new ProxyPoolService("YOUR_PASSWORD"));
```
This service abstracts the complexity of session management. By injecting IProxyPoolService into your scrapers, you can request a fresh client with a specific geo-location or sticky session on demand.
## 4. High Concurrency with Parallel.ForEachAsync
Scraping at scale requires parallel execution. .NET 6 introduced Parallel.ForEachAsync, which is perfect for I/O-bound work like HTTP scraping. Combined with a rate-limited HttpClient, it allows you to process thousands of URLs efficiently.
```csharp
using System;
using System.Collections.Concurrent;
using System.Net.Http;
using System.Threading.Tasks;

public class ParallelScraper
{
    private readonly IProxyPoolService _proxyPool;

    public ParallelScraper(IProxyPoolService proxyPool)
    {
        _proxyPool = proxyPool;
    }

    public async Task ScrapeAsync(string[] urls)
    {
        var results = new ConcurrentBag<string>();
        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = 20 // Limit concurrency
        };

        await Parallel.ForEachAsync(urls, options, async (url, ct) =>
        {
            try
            {
                // Create a fresh client for each request or batch
                // In production, reuse clients per proxy endpoint
                using var client = _proxyPool.CreateClient("US");
                var content = await client.GetStringAsync(url, ct);
                results.Add($"{url}: {content.Length} chars");
                Console.WriteLine($"Fetched {url}");
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Error fetching {url}: {ex.Message}");
            }
        });

        Console.WriteLine($"Completed. Total results: {results.Count}");
    }
}
```
Note that we limit MaxDegreeOfParallelism. Without this, Parallel.ForEachAsync may spawn too many concurrent requests, leading to port exhaustion or proxy bans.
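MaxDegreeOfParallelism caps how many loop bodies run at once, but sometimes you also want a hard cap on in-flight requests toward a single proxy endpoint, independent of the loop. A SemaphoreSlim gate is one way to do that. The sketch below is a standalone illustration (the class and names are mine, not a library API) that replaces the HTTP call with Task.Delay so the cap can be observed without a network:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class ThrottleDemo
{
    // Runs 'items' tasks through a SemaphoreSlim allowing 'limit' at a time,
    // and returns the highest concurrency actually observed inside the gate.
    public static async Task<int> RunAsync(int items, int limit)
    {
        int inFlight = 0, peak = 0;
        using var gate = new SemaphoreSlim(limit, limit);
        var tasks = new Task[items];

        for (int i = 0; i < items; i++)
        {
            tasks[i] = Task.Run(async () =>
            {
                await gate.WaitAsync();
                try
                {
                    var now = Interlocked.Increment(ref inFlight);
                    // Record the highest concurrency we ever see
                    int seen;
                    while (now > (seen = Volatile.Read(ref peak)))
                        Interlocked.CompareExchange(ref peak, now, seen);
                    await Task.Delay(25); // Stand-in for the HTTP request
                }
                finally
                {
                    Interlocked.Decrement(ref inFlight);
                    gate.Release();
                }
            });
        }

        await Task.WhenAll(tasks);
        return peak;
    }
}
```

In a real scraper, the body inside the gate would be the proxied HTTP call; the same pattern also works inside a Parallel.ForEachAsync body when the two limits differ.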
## 5. Resilience with Polly: Retries and Circuit Breakers
Network requests fail. Proxies get banned, IPs rotate, and servers return 429 Too Many Requests. Polly is the standard resilience library for .NET, providing retries, circuit breakers, and timeouts.
```csharp
using System;
using System.Net;
using System.Net.Http;
using Polly;
using Polly.Extensions.Http;

public static class PollyPolicies
{
    public static IAsyncPolicy<HttpResponseMessage> GetRetryPolicy()
    {
        return HttpPolicyExtensions
            .HandleTransientHttpError() // 5xx and network failures
            .OrResult(msg => msg.StatusCode == HttpStatusCode.TooManyRequests)
            .WaitAndRetryAsync(3, retryAttempt =>
            {
                Console.WriteLine($"Retry {retryAttempt}");
                return TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)); // Exponential backoff
            });
    }

    public static IAsyncPolicy<HttpResponseMessage> GetCircuitBreakerPolicy()
    {
        return HttpPolicyExtensions
            .HandleTransientHttpError()
            .CircuitBreakerAsync(
                handledEventsAllowedBeforeBreaking: 5,
                durationOfBreak: TimeSpan.FromSeconds(30),
                onBreak: (outcome, timeSpan) =>
                {
                    Console.WriteLine($"Circuit broken for {timeSpan.TotalSeconds}s");
                },
                onReset: () => Console.WriteLine("Circuit reset")
            );
    }
}

// Usage in Program.cs with IHttpClientFactory
/*
builder.Services
    .AddHttpClient<MyScraperService>(client =>
    {
        client.BaseAddress = new Uri("https://example.com");
    })
    .ConfigurePrimaryHttpMessageHandler(() => new SocketsHttpHandler
    {
        Proxy = new WebProxy("http://gate.proxyhat.com:8080")
        {
            Credentials = new NetworkCredential("user-country-US", "PASSWORD")
        },
        UseProxy = true
    })
    .AddPolicyHandler(PollyPolicies.GetRetryPolicy())
    .AddPolicyHandler(PollyPolicies.GetCircuitBreakerPolicy());
*/
```
By combining retries with a circuit breaker, you prevent hammering a failing endpoint. When the circuit breaks, the application stops sending requests for 30 seconds, allowing the proxy service or target site to recover.
## 6. TLS, Certificate Pinning, and Custom Root CA
Some advanced scraping scenarios involve TLS inspection, MITM proxies, or self-signed certificates. By default, .NET validates SSL certificates strictly. To work with custom root CAs or perform certificate pinning, you must configure SslClientAuthenticationOptions via SocketsHttpHandler.
```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Net.Security;
using System.Security.Authentication;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;

public class TlsPinnedClient
{
    private readonly HttpClient _client;

    public TlsPinnedClient(string proxyPassword)
    {
        var proxy = new WebProxy("http://gate.proxyhat.com:8080")
        {
            Credentials = new NetworkCredential("user-country-US", proxyPassword)
        };

        var handler = new SocketsHttpHandler
        {
            Proxy = proxy,
            UseProxy = true,
            SslOptions = new SslClientAuthenticationOptions
            {
                EnabledSslProtocols = SslProtocols.Tls12 | SslProtocols.Tls13,
                // Allow all certificates for MITM debugging (INSECURE - use only for debugging)
                RemoteCertificateValidationCallback = (sender, cert, chain, errors) =>
                {
                    // Custom validation logic: Pin specific certificate or CA
                    if (errors == SslPolicyErrors.None) return true;

                    // Example: Trust a specific root CA
                    // var caCert = new X509Certificate2("path/to/root-ca.crt");
                    // chain.ChainPolicy.ExtraStore.Add(caCert);
                    // chain.Build(cert);
                    // ... validation logic

                    Console.WriteLine($"SSL Error: {errors}");
                    return true; // WARNING: Bypasses validation
                }
            }
        };

        _client = new HttpClient(handler);
    }

    public async Task<string> GetAsync(string url)
    {
        return await _client.GetStringAsync(url);
    }
}
```
**Security Warning:** Bypassing SSL validation (`return true`) makes your application vulnerable to MITM attacks. In production, only bypass validation if you are using a trusted internal MITM proxy (like Squid with ssl_bump) and you verify the specific root certificate.
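A safer alternative to `return true` is to pin the exact certificate you deployed to your internal proxy by thumbprint. The sketch below shows one way to write such a callback; the class name and the thumbprint constant are placeholders, not part of any API:

```csharp
using System;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

public static class PinnedValidation
{
    // Placeholder: replace with the thumbprint of the certificate you actually deploy.
    public const string PinnedThumbprint = "REPLACE_WITH_REAL_THUMBPRINT";

    // Drop-in for SslClientAuthenticationOptions.RemoteCertificateValidationCallback
    public static bool Validate(
        object sender, X509Certificate? cert, X509Chain? chain, SslPolicyErrors errors)
    {
        // Standard chain validation succeeded: accept as usual.
        if (errors == SslPolicyErrors.None) return true;
        if (cert is null) return false;

        // Otherwise accept only the one certificate we explicitly pinned.
        using var presented = new X509Certificate2(cert);
        return string.Equals(presented.Thumbprint, PinnedThumbprint,
            StringComparison.OrdinalIgnoreCase);
    }
}
```

Assign it with `RemoteCertificateValidationCallback = PinnedValidation.Validate` in the handler's SslOptions; every certificate other than the pinned one is then rejected whenever normal validation fails.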
## Comparing Proxy Types for .NET Applications
Choosing the right proxy type depends on your target site's anti-bot measures. Here is a comparison of residential, mobile, and datacenter proxies for C# projects:
| Feature | Datacenter Proxies | Residential Proxies | Mobile Proxies |
|---|---|---|---|
| Speed | Very Fast | Medium | Variable |
| Detection Risk | High | Low | Very Low |
| Cost | $ | $$ | $$$ |
| Best Use Case | High-volume, low-security targets | SERP, e-commerce, social media | Sneaker sites, ticketing, strict anti-bot |
| .NET Concurrency | High (100+ connections) | Medium (10-50 connections) | Low (1-5 connections) |
For most SERP scraping and price monitoring tasks, residential proxies offer the best balance. You can explore proxy locations to find geo-targeting options that suit your needs.
## Key Takeaways
- **Reuse HttpClient**: Avoid creating new clients for every request. Use `SocketsHttpHandler` with `PooledConnectionLifetime` to manage connection health.
- **Rotate Credentials, Not Just IPs**: Use the username field for session IDs and geo-targeting to rotate proxies dynamically without changing infrastructure code.
- **Implement Resilience**: Polly is essential for handling transient failures and rate limiting (HTTP 429).
- **Limit Concurrency**: Use `Parallel.ForEachAsync` with `MaxDegreeOfParallelism` to avoid port exhaustion.
- **Handle TLS Explicitly**: Configure `SslClientAuthenticationOptions` if you need to pin certificates or work with custom CAs.
Ready to integrate proxies into your .NET application? Check out ProxyHat pricing to get started with residential and mobile proxies designed for high-performance scraping.