Coronium Mobile Proxies
Integration Guide
Updated April 18, 2026
Rental retiring Oct 2026

Apify Proxy Integration with Mobile IPs (2026 Guide)

A working engineer's guide to plugging Coronium 4G/5G mobile proxies into Apify Actors. We cover the 2026 pay-per-event migration, marketplace economics, Python and JavaScript SDK configuration, LlamaIndex and LangChain integrations, and how to ship a Coronium-powered Actor on the Apify Store before the rental model is retired on October 1, 2026.

Actors Migrated: 2,000+ to pay-per-event
Headquarters: Prague, Czech Republic
Team Size: ~155 employees (2025)
2024 Revenue: $13.3M reported revenue

What Is Apify?

Apify is the largest marketplace of pre-built web scrapers on the internet plus the cloud infrastructure that runs them. A scraper on Apify is called an Actor: a containerized program with a declared input schema, output dataset, and a billing plan that the Apify platform meters automatically. Users either consume Actors that other developers publish on the Store, or they build and deploy their own Actors and optionally monetize them.

Company Snapshot

Founded: 2015
Founders: Jan Curn, Jakub Balada
HQ: Prague, Czech Republic
Employees (2025): ~155
2024 Revenue: $13.3M
Total Funding: $3.29M

Funding & Backers

Latest Round: $2.98M
Round Date: April 15, 2024
Lead Investor: J&T Ventures
Also Backed By: Reflex Capital
Accelerator: Y Combinator
Category: Scraping Infrastructure

Two Sides of the Platform

Apify operates on two sides simultaneously. Developers build Actors and monetize them on a marketplace that now contains several thousand public scrapers. End users, from solo operators to Fortune 500 data teams, run those Actors on demand without maintaining any infrastructure. Both sides share the same cloud runtime: Kubernetes nodes in multiple regions, a request queue service, key-value stores, dataset storage, and Apify's own residential and datacenter proxy pools.

For End Users

Run pre-built Actors for Instagram, TikTok, LinkedIn, Amazon, Google Maps, and thousands more

No infrastructure, no captcha solving, no proxy management required

Pay only for what you run: per-event, per-result, or per-usage

Schedule runs, trigger via webhook, export to S3, BigQuery, Zapier, Make

For Developers

Ship an Actor in Python or JavaScript and publish to the Store

Earn 80% of monthly rental fees or per-event charges

Creator Plan: $500/month of platform usage at $1/month for 6 months

Built-in dataset, request queue, key-value store, proxy, and logging

Pricing Migration 2026: Rental Is Retiring

Critical Deadlines

  • April 1, 2026: No new rental Actors will be accepted on the Apify Store. All newly published Actors must use pay-per-event, pay-per-result, or pay-per-usage pricing.
  • October 1, 2026: Rental pricing is fully retired. Any Actor still on a rental plan after this date stops billing users and must migrate to survive.

Apify spent most of 2024 and 2025 running the largest pricing-model migration in the company's history. Rental pricing (a flat monthly fee for access to an Actor) was the original model but suffered from a classic subscription problem: users paid even when they did not run the scraper, and developers had no way to differentiate casual lookups from heavy crawls. Pay-per-event (PPE) solves both problems by metering discrete billable events.

The Three Surviving Pricing Models

Pay-Per-Event (PPE)

The developer declares named events in actor.json, then calls Actor.charge(eventName, count) inside the Actor code whenever one happens. Typical events: page-opened, dataset-item-stored, api-call, captcha-solved, image-downloaded.

Example: $0.50 per 1,000 pages opened + $2.00 per 1,000 dataset items

Pay-Per-Result

Simpler: charge a fixed amount per item pushed to the default dataset. Users love the predictability; developers find it harder to price properly when results have radically different cost profiles.

Typical range: $1.00 - $10.00 per 1,000 results

Pay-Per-Usage

The raw platform model: users pay for compute units, bandwidth, and storage that the Actor consumes. The developer earns nothing extra. Best for internal Actors or open-source community scrapers.

Example: $0.40/CU compute, $0.25/GB bandwidth, $0.20/GB-month storage

Why More Than 2,000 Actors Already Migrated

Apify published migration data showing that developers who switched from pay-per-result to pay-per-event saw average revenue per Actor increase because PPE lets them price each expensive event (page loads, captcha solves, external API calls) separately instead of burying them all in one result price. As of early 2026, more than 2,000 Actors on the Store have migrated to PPE voluntarily before the rental deadline forces the rest.

Developer Economics Under PPE

  • Developer keeps 80% of billable event revenue
  • Apify keeps 20% to cover platform, payments, and fraud detection
  • Monthly payouts via Stripe Connect or Wise
  • Free-tier users consume your quota, but the 80/20 split only applies to paid events
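The arithmetic above is easy to get wrong when you mix per-1,000 prices with per-event prices. Here is a minimal sketch that computes a user's bill and the developer payout under the example PPE prices quoted earlier in this guide ($0.50 per 1,000 pages, $2.00 per 1,000 items); the event names and figures are the guide's examples, not a fixed Apify schema:

```python
# Sketch: what one run bills and what the developer keeps, using the
# example PPE prices from this guide ($0.0005/page, $0.002/item).
PRICES_USD = {"page-opened": 0.0005, "dataset-item-stored": 0.002}
DEVELOPER_SHARE = 0.80  # Apify keeps 20%

def run_economics(events: dict) -> tuple:
    """Return (user_bill, developer_payout) for a run's event counts."""
    bill = sum(PRICES_USD[name] * count for name, count in events.items())
    return round(bill, 4), round(bill * DEVELOPER_SHARE, 4)

bill, payout = run_economics({"page-opened": 10_000, "dataset-item-stored": 4_000})
# 10k pages -> $5.00, 4k items -> $8.00: user pays $13.00, developer keeps $10.40
```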

Apify Actors Explained

An Apify Actor is a Docker container plus four metadata files that turn raw code into a monetizable product. Understanding the file layout is the single biggest lever for building a scraper that is easy to ship, easy to configure, and easy to price.

actor.json

Declares the Actor's name, version, input schema path, default pricing plan, memory and timeout defaults, and the list of billable events for PPE.

input_schema.json

JSON Schema that powers the Actor's input form on the Apify console. This is where you add a "Use my Coronium proxy" field if you want users to supply their own endpoint.

Dockerfile

Standard Docker build. Apify provides official base images for Python, Node, Chrome, Firefox, Playwright, and Selenium that shave minutes off cold starts.

README.md

Shown on the Store listing. Apify's SEO team ranks Actor READMEs aggressively, so a well-written README drives organic installs on its own.

What Actors Actually Cost in 2026

Actor Category | Typical Price | Pricing Model | Why That Price
Generic HTML scraper | $1.00 / 1k results | Per-result | Low compute, datacenter proxy
Google Maps scraper | $4.00 / 1k places | Per-result | Residential proxy required
LinkedIn profile scraper | $8.00 / 1k profiles | Per-event | Mobile proxy + anti-bot
TikTok video scraper | $6.00 / 1k videos | Per-event | 4G proxy ideal for mobile-first API
Instagram hashtag scraper | $5.00 / 1k posts | Per-event | Mobile proxy mandatory
Amazon product scraper | $3.00 / 1k items | Per-result | Residential proxy sufficient
SERP scraper | $2.50 / 1k SERPs | Per-event | Heavy captcha load
Twitter / X scraper | $10.00 / 1k tweets | Per-event | API rate limits + mobile proxy

Creator Plan: Ship Your First Actor Almost Free

Apify's Creator Plan grants $500 of platform usage every month for the first six months at a total cost of $1 per month. That covers compute, bandwidth, storage, and Apify Proxy. The catch: you must publish at least one public Actor within those six months, and it must survive a short review.

  • $500/month usage cap, rolling 30-day window
  • $1/month for the first six months, then standard pricing
  • Stackable with Store revenue (you keep 80% of user payments)

Apify Proxy vs Coronium BYOP

Apify ships its own proxy product (Apify Proxy) with datacenter, residential, and a limited mobile pool. For most generic scraping jobs Apify Proxy is the path of least resistance. For mobile-first targets, aggressive anti-bot stacks, and anything where CGNAT trust matters, bringing your own Coronium endpoint gives measurably better success rates.

Apify Proxy (native)

Zero configuration: turn on Apify Proxy in the Actor's proxy input and the SDK routes all traffic through the managed pool.

Datacenter pool free on paid plans

Residential pool at $8/GB

Built-in session rotation and sticky IPs

Mobile pool is small and per-country selection is limited

No dedicated-IP option for long sticky sessions

Coronium BYOP

Plug a single Coronium endpoint into ProxyConfiguration and scale to the Actor's concurrency limit without touching Apify's proxy pricing.

Real 4G/5G carrier IPs with full CGNAT trust

Dedicated IP per device (no shared pool)

On-demand IP rotation via HTTP endpoint

Flat monthly pricing, no bandwidth overages

Country-specific endpoints (US, UK, Europe, more)
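The on-demand rotation endpoint is just an HTTP URL from your Coronium dashboard. A minimal standalone helper, sketched with stdlib only; the retry count and settle delay are illustrative assumptions, since the time a modem needs to re-attach varies by carrier:

```python
# Sketch: trigger Coronium's HTTP rotation endpoint and wait for the
# modem to re-attach. The URL comes from your Coronium dashboard; the
# retry and settle values here are illustrative, not vendor-specified.
import time
import urllib.request

def rotate_ip(rotate_url: str, retries: int = 3, settle_secs: float = 5.0) -> bool:
    """Call the rotation endpoint; return True once it responds with 2xx."""
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(rotate_url, timeout=10) as resp:
                if 200 <= resp.status < 300:
                    time.sleep(settle_secs)  # give the device time to get a new IP
                    return True
        except OSError:
            time.sleep(2 * attempt)  # simple backoff before retrying
    return False
```

Call it between scraping batches rather than between every request: each rotation costs a few seconds of downtime on the device.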

Decision Matrix: Which One Should Your Actor Use?

Target | Recommended Proxy | Why
Instagram, TikTok, Facebook mobile | Coronium 4G | Mobile-first endpoints aggressively block datacenter
LinkedIn, Indeed, Glassdoor | Coronium 4G | Strict fingerprinting + AS-number trust scoring
Google Search / SERP | Coronium 4G or Apify residential | Heavy captcha load; mobile usually wins
Amazon, eBay, Walmart | Apify residential | Bot defenses tolerate quality residential
Generic news, blogs, docs | Apify datacenter | Cheapest option that still works
Regional price monitoring | Coronium per-country | Deterministic geo-IP, no VPN fingerprint
API endpoints with mTLS | Coronium dedicated IP | Whitelisted IP required

Configuring Custom Proxy in actor.json

Every Actor declares its proxy requirements in actor.json. For BYOP, you either hardcode a Coronium endpoint (useful when you ship a Coronium-powered Actor) or you expose a proxy field in input_schema.json so the end user supplies their own. Here is the canonical shape of both.

actor.json with proxy requirement

{
  "actorSpecification": 1,
  "name": "coronium-powered-scraper",
  "version": "0.1",
  "buildTag": "latest",
  "title": "Coronium Mobile Proxy Scraper",
  "description": "Scrapes target URLs through Coronium 4G mobile IPs.",
  "dockerfile": "./Dockerfile",
  "input": "./input_schema.json",
  "storages": {
    "dataset": {
      "actorSpecification": 1,
      "views": {
        "default": {
          "title": "Results",
          "transformation": { "fields": ["url", "title", "scrapedAt"] }
        }
      }
    }
  },
  "minMemoryMbytes": 512,
  "maxMemoryMbytes": 4096,
  "defaultRunOptions": {
    "build": "latest",
    "timeoutSecs": 3600,
    "memoryMbytes": 2048
  },
  "meta": { "templateId": "python-start" },
  "pricingInfos": [
    {
      "pricingModel": "PAY_PER_EVENT",
      "pricingPerEvent": {
        "actorChargeEvents": {
          "page-opened": {
            "eventTitle": "Page opened",
            "eventDescription": "Charged for each page the Actor visits.",
            "eventPriceUsd": 0.0005
          },
          "dataset-item-stored": {
            "eventTitle": "Result stored",
            "eventDescription": "Charged for each item pushed to the dataset.",
            "eventPriceUsd": 0.002
          }
        }
      }
    }
  ]
}

input_schema.json with BYOP field

{
  "title": "Coronium Scraper Input",
  "type": "object",
  "schemaVersion": 1,
  "properties": {
    "startUrls": {
      "title": "Start URLs",
      "type": "array",
      "editor": "requestListSources",
      "description": "URLs the Actor will visit first."
    },
    "proxyConfiguration": {
      "title": "Proxy",
      "type": "object",
      "editor": "proxy",
      "description": "Choose Apify Proxy or supply Coronium URLs below.",
      "prefill": { "useApifyProxy": false }
    },
    "coroniumProxyUrls": {
      "title": "Coronium proxy URLs (BYOP)",
      "type": "array",
      "editor": "stringList",
      "description": "One or more Coronium endpoints, e.g. http://user:pass@us.coronium.io:30000",
      "default": []
    },
    "coroniumRotateUrl": {
      "title": "Coronium rotate endpoint",
      "type": "string",
      "editor": "textfield",
      "description": "Optional HTTP rotation URL provided in your Coronium dashboard.",
      "isSecret": true
    }
  },
  "required": ["startUrls"]
}

Secret Inputs

Always mark proxy credentials as isSecret: true. Apify encrypts secret input fields at rest and redacts them from run logs. Users can paste their Coronium username and password without worrying that another Actor developer sees them in a shared workspace.

Python SDK Integration

The Apify SDK for Python ships a ProxyConfiguration helper that accepts a list of proxy URLs and hands out the next one each time you call new_url(). Coronium endpoints fit the interface natively. The snippet below is a working Actor that pulls Coronium URLs from input, rotates for each request, and emits PPE charges.

# main.py - Apify Python SDK with Coronium BYOP
from datetime import datetime, timezone

import httpx
from apify import Actor, ProxyConfiguration
from selectolax.parser import HTMLParser

async def main() -> None:
    async with Actor:
        actor_input = await Actor.get_input() or {}

        start_urls = actor_input.get("startUrls", [])
        coronium_urls = actor_input.get("coroniumProxyUrls", [])
        rotate_url = actor_input.get("coroniumRotateUrl")

        # Build a ProxyConfiguration from Coronium endpoints
        if coronium_urls:
            proxy_config = ProxyConfiguration(proxy_urls=coronium_urls)
        else:
            proxy_config = await Actor.create_proxy_configuration(
                groups=["RESIDENTIAL"],
                country_code="US",
            )

        for request in start_urls:
            url = request["url"] if isinstance(request, dict) else request
            proxy_url = await proxy_config.new_url()

            Actor.log.info(f"Fetching {url} via {proxy_url}")

            # httpx binds the proxy to the client, not to a single request,
            # so open a short-lived client per URL to honor rotation
            async with httpx.AsyncClient(timeout=30, proxy=proxy_url) as client:
                response = await client.get(
                    url,
                    headers={"User-Agent": "Mozilla/5.0 Apify/Coronium"},
                )

            # Charge one page-opened event
            await Actor.charge("page-opened")

            if response.status_code == 200:
                tree = HTMLParser(response.text)
                title_node = tree.css_first("title")
                title = title_node.text() if title_node else None

                await Actor.push_data({
                    "url": url,
                    "title": title,
                    "status": response.status_code,
                    "scrapedAt": datetime.now(timezone.utc).isoformat(),
                })
                await Actor.charge("dataset-item-stored")

            # Optional: rotate Coronium IP for next request
            # (call the rotation URL directly, not through the proxy)
            if rotate_url:
                try:
                    async with httpx.AsyncClient(timeout=10) as rotator:
                        await rotator.get(rotate_url)
                except Exception as e:
                    Actor.log.warning(f"Rotation failed: {e}")

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())

What the SDK handles for you

Round-robin URL selection with session stickiness

Automatic retries on proxy errors (502, 504, timeouts)

Secret redaction in logs

Charge deduplication inside a single request

Graceful shutdown on platform SIGTERM

Crawlee-Python variant

For larger projects swap httpx for Crawlee-Python:

from apify import Actor, ProxyConfiguration
from crawlee.playwright_crawler import PlaywrightCrawler

proxy_config = ProxyConfiguration(proxy_urls=coronium_urls)
crawler = PlaywrightCrawler(proxy_configuration=proxy_config)

@crawler.router.default_handler
async def handler(context) -> None:
    # push each visited page into the default dataset
    await context.push_data({"url": context.request.url})

await crawler.run(start_urls)

Python Version Requirements

The Apify SDK for Python requires Python 3.9 or newer. As of SDK 2.x (current in 2026) async is mandatory; the old synchronous API is deprecated. Use the official base image apify/actor-python:3.12 to avoid cold-start surprises.

JavaScript SDK Integration

The JavaScript SDK mirrors the Python API almost field-for-field. Where it wins is the Crawlee ecosystem: PlaywrightCrawler, PuppeteerCrawler, CheerioCrawler, and JSDOMCrawler all accept a ProxyConfiguration instance directly.

// main.js - Apify JS SDK + Crawlee + Coronium BYOP
import { Actor } from 'apify';
import { PlaywrightCrawler, ProxyConfiguration } from 'crawlee';

await Actor.init();

const input = await Actor.getInput() ?? {};
const {
    startUrls = [],
    coroniumProxyUrls = [],
    coroniumRotateUrl,
    maxConcurrency = 5,
} = input;

// Build the proxy configuration from Coronium endpoints
const proxyConfiguration = coroniumProxyUrls.length > 0
    ? new ProxyConfiguration({ proxyUrls: coroniumProxyUrls })
    : await Actor.createProxyConfiguration({
        groups: ['RESIDENTIAL'],
        countryCode: 'US',
    });

const crawler = new PlaywrightCrawler({
    proxyConfiguration,
    maxConcurrency,
    launchContext: {
        launchOptions: {
            headless: true,
            args: ['--disable-blink-features=AutomationControlled'],
        },
    },
    async requestHandler({ page, request, log }) {
        log.info(`Scraping ${request.url}`);
        await page.waitForLoadState('domcontentloaded');

        const title = await page.title();
        const html = await page.content();

        await Actor.charge({ eventName: 'page-opened' });

        await Actor.pushData({
            url: request.url,
            title,
            htmlLength: html.length,
            scrapedAt: new Date().toISOString(),
        });

        await Actor.charge({ eventName: 'dataset-item-stored' });

        // Rotate Coronium IP between requests when an endpoint is supplied
        if (coroniumRotateUrl) {
            try {
                await fetch(coroniumRotateUrl, {
                    method: 'GET',
                    signal: AbortSignal.timeout(10_000),
                });
            } catch (err) {
                log.warning(`Coronium rotate failed: ${err.message}`);
            }
        }
    },
    failedRequestHandler({ request, log }, error) {
        log.error(`${request.url} failed: ${error.message}`);
    },
});

await crawler.run(startUrls);

await Actor.exit();

CheerioCrawler for pure HTML (no browser)

import { CheerioCrawler, ProxyConfiguration } from 'crawlee';
import { Actor } from 'apify';

const crawler = new CheerioCrawler({
    proxyConfiguration: new ProxyConfiguration({
        proxyUrls: [
            'http://user:pass@us.coronium.io:30000',
            'http://user:pass@us.coronium.io:30001',
            'http://user:pass@us.coronium.io:30002',
        ],
    }),
    maxRequestsPerMinute: 180,       // stay polite
    requestHandlerTimeoutSecs: 60,
    async requestHandler({ $, request }) {
        const title = $('title').text();
        const prices = $('.price').map((_, el) => $(el).text()).get();
        await Actor.pushData({ url: request.url, title, prices });
        await Actor.charge({ eventName: 'page-opened' });
    },
});

await crawler.run(startUrls);

Node Runtime Requirements

  • Apify SDK for JavaScript 3.x requires Node 18+; Node 20 recommended
  • Crawlee 3.x ships its own fingerprint generator that pairs cleanly with mobile IPs
  • Official base image: apify/actor-node-playwright-chrome:20
  • Use apify-cli to run locally: apify run -p

LlamaIndex + Apify

LlamaIndex ships an official ApifyActor loader that runs an Apify Actor on demand and feeds its dataset into a VectorStoreIndex. Combined with a Coronium-powered Actor you get a fully auditable RAG pipeline: the retrieval layer sees real mobile IPs, and LlamaIndex handles chunking, embedding, and retrieval.

# pip install llama-index llama-index-readers-apify apify-client
from llama_index.core import VectorStoreIndex, Document
from llama_index.readers.apify import ApifyActor

# The reader is constructed with your Apify API token;
# the Actor to run is passed to load_data() below
reader = ApifyActor("<APIFY_API_TOKEN>")

documents = reader.load_data(
    actor_id="your-username/coronium-powered-scraper",
    run_input={
        "startUrls": [
            {"url": "https://example.com/docs"},
            {"url": "https://example.com/blog"},
        ],
        "coroniumProxyUrls": [
            "http://user:pass@us.coronium.io:30000",
        ],
    },
    dataset_mapping_function=lambda item: Document(
        text=item.get("htmlContent") or item.get("title", ""),
        metadata={
            "url": item.get("url"),
            "scraped_at": item.get("scrapedAt"),
            "source": "coronium-apify",
        },
    ),
)

# Build a vector index over the freshly scraped content
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(similarity_top_k=5)

answer = query_engine.query("What changed in the pricing section last week?")
print(answer)

Why This Matters

  • Fresh corpus per query. The ApifyActor loader can be invoked at query time, meaning your LLM answers from data that is seconds old instead of whatever was crawled last week.
  • Proxy provenance. Every document ingested carries metadata about which Coronium IP fetched it. Useful for compliance and for debugging geographical content differences.
  • Cost control. Under PPE, LlamaIndex only pays for the pages it actually reads, not a flat monthly fee for an Actor it might never run.

LangChain + Apify

LangChain exposes two helpers for Apify: ApifyWrapper that triggers an Actor run and ApifyDatasetLoader that pulls results from an existing dataset ID. Both produce LangChain Document objects suitable for chains, agents, and retrievers.

# pip install langchain langchain-apify apify-client
from langchain.indexes import VectorstoreIndexCreator
from langchain_apify import ApifyWrapper
from langchain_core.documents import Document
from langchain_openai import OpenAIEmbeddings

apify = ApifyWrapper()

loader = apify.call_actor(
    actor_id="your-username/coronium-powered-scraper",
    run_input={
        "startUrls": [{"url": "https://news.example.com/latest"}],
        "coroniumProxyUrls": [
            "http://user:pass@us.coronium.io:30000",
            "http://user:pass@us.coronium.io:30001",
        ],
    },
    dataset_mapping_function=lambda item: Document(
        page_content=item.get("htmlContent") or item.get("title", ""),
        metadata={
            "url": item.get("url"),
            "source": "coronium-apify",
            "scraped_at": item.get("scrapedAt"),
        },
    ),
    ),
)

# Vector index
index = VectorstoreIndexCreator(
    embedding=OpenAIEmbeddings(model="text-embedding-3-large"),
).from_loaders([loader])

result = index.query("Summarize today's top stories.")
print(result)

ApifyDatasetLoader for Pre-Existing Runs

from langchain_apify import ApifyDatasetLoader

# Already ran the Actor? Load the dataset directly.
loader = ApifyDatasetLoader(
    dataset_id="AbCdEfGhIj1234567",
    dataset_mapping_function=lambda x: Document(
        page_content=x["html"] or "",
        metadata={"url": x["url"]},
    ),
)

docs = loader.load()
print(f"Loaded {len(docs)} documents from Coronium-scraped dataset")

Production Tips

  • Set memory_mbytes=4096 on long-running Actor calls to avoid OOM during large retrievals
  • Use build="latest" only in dev; pin a specific build tag in production for reproducibility
  • Store the Apify token as a secret in LangSmith or Vault; never embed in source
  • Cache Actor runs with a hash key based on startUrls to avoid re-scraping identical inputs
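The caching tip above is easy to implement with a deterministic key. A minimal sketch, where `run_cache_key` is a hypothetical helper (not part of the Apify SDK) that hashes the canonicalized run input:

```python
# Sketch: derive a stable cache key from an Actor's run input so that
# identical startUrls are not re-scraped. Hypothetical helper, not SDK API.
import hashlib
import json

def run_cache_key(actor_id: str, run_input: dict) -> str:
    """Same actor + same input (key order ignored) -> same key."""
    canonical = json.dumps(run_input, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]
    return f"{actor_id}:{digest}"

a = run_cache_key("user/scraper", {"startUrls": [{"url": "https://example.com"}]})
b = run_cache_key("user/scraper", {"startUrls": [{"url": "https://example.com"}]})
# a == b, so a lookup in your cache (Redis, a key-value store) can skip run two
```

Look the key up before calling the Actor, and store the dataset ID under it after a successful run.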

Building Your Own Coronium-Powered Actor

The fastest path to shipping a Coronium-powered Actor on the Apify Store: use the official template, wire in the Coronium endpoint as an input, declare PPE events in actor.json, and push. Apify's build system compiles the Dockerfile in the cloud, so you do not need a local Docker daemon at all.

Step-by-step

# 1. Install the Apify CLI
npm install -g apify-cli

# 2. Log in (opens browser to fetch token)
apify login

# 3. Create a new Actor from template
apify create coronium-powered-scraper --template=python-crawlee-playwright

# 4. Move into the folder
cd coronium-powered-scraper

# 5. Edit .actor/actor.json to add PPE events (see earlier snippet)
# 6. Edit .actor/input_schema.json to add coroniumProxyUrls
# 7. Edit src/main.py to use ProxyConfiguration (see Python section)

# 8. Test locally with a real Coronium endpoint
apify run --purge --input='{
  "startUrls": [{"url": "https://example.com"}],
  "coroniumProxyUrls": ["http://user:pass@us.coronium.io:30000"]
}'

# 9. Push to Apify Cloud
apify push

# 10. (Optional) Publish to the Store
#    - Add screenshots, icon, category
#    - Submit for review
#    - Typical approval: 1-3 business days

Monetization Checklist

Must-Have for Store Publication

PPE, pay-per-result, or pay-per-usage pricing (no rental)

Clear README with at least one working example

Icon (512x512 PNG) and 3+ screenshots

Input schema with descriptions on every field

Sample dataset output (at least 5 rows)

Stripe Connect or Wise payout account

Nice-to-Have (Drives Installs)

Video demo embedded in README

Free-tier quota (e.g., 100 results free per user)

Integration examples with Zapier, Make, Sheets

Separate PPE events for expensive operations

Performance badge ("Scraped 10M+ pages in 2025")

Proxy comparison table in README

Hidden Costs to Price Against

When you bake a Coronium subscription into a Store Actor, the Actor's price must cover both Apify platform fees and your Coronium cost. A simple model:

  • Apify compute: ~$0.40 per compute unit (1 GB * 1 hour)
  • Apify takes 20% of your event revenue
  • Coronium flat fee per device (amortized across all users)
  • Price each event at 3-5x marginal cost to leave room for support and iteration
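Putting those four bullets together, you can back out a minimum viable event price. The sketch below uses the compute figure from this guide ($0.40/CU) and the US Coronium price quoted later ($129/month); the per-user amortization and event volume are assumptions you must replace with your own numbers:

```python
# Sketch: back out a minimum user-facing event price from marginal cost.
# Compute and proxy figures are the guide's examples; volume is assumed.
def min_event_price(
    compute_units_per_event: float,
    cu_price_usd: float = 0.40,           # Apify compute, per CU
    coronium_monthly_usd: float = 129.0,  # one US device, flat fee
    events_per_month: int = 100_000,      # assumed volume across all users
    developer_share: float = 0.80,        # Apify keeps 20%
    markup: float = 3.0,                  # low end of the 3-5x guideline
) -> float:
    marginal = (compute_units_per_event * cu_price_usd
                + coronium_monthly_usd / events_per_month)
    # the listed price must cover the markup AND the 20% platform cut
    return round(marginal * markup / developer_share, 6)

price = min_event_price(compute_units_per_event=0.001)
# marginal = 0.001*0.40 + 129/100000 = 0.00169 per event
# price ~= 0.00169 * 3 / 0.8, i.e. roughly $6.34 per 1,000 events
```

Note how the flat Coronium fee dominates at low volume: double `events_per_month` and the viable price drops sharply, which is the economic argument for amortizing one device across many users.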


Ship Your Apify Actor with Coronium Mobile IPs

Flat-rate 4G and 5G endpoints, dedicated IPs, on-demand rotation, and per-country targeting. Drop them into any Actor's ProxyConfiguration in one line.

Premium Mobile Proxy Pricing

Configure & Buy Mobile Proxies

Select from 10+ countries with real mobile carrier IPs and flexible billing options

Choose Billing Period

Select the billing cycle that works best for you

SELECT LOCATION

Country | Monthly Price | Notes
USA | $129/m | popular
UK | $97/m | popular
France | $79/m |
Germany | $89/m |
Spain | $96/m |
Netherlands | $79/m |
Australia | $119/m |
Italy | $127/m |
Brazil | $99/m |
Canada | $159/m |
Poland | $69/m |
Ireland | $59/m |
Lithuania | $59/m |
Portugal | $89/m |
Romania | $49/m | on sale
Ukraine | $27/m | on sale
Georgia | $69/m | on sale
Thailand | $59/m | on sale

Save up to 10% when you order 5+ proxy ports.

Carrier & Region

US endpoints run on AT&T and other carriers, with Florida and New York regions available. A sample US configuration (AT&T, Florida, monthly plan) costs $129/month with unlimited bandwidth. Every plan includes a dedicated device, a real mobile IP, 10-100 Mbps speed, and unlimited data. No commitment, cancel anytime, with a money-back guarantee if not satisfied.

Typical use cases: multi-account management, web scraping without blocks, geo-specific content access, and social media automation. Coronium serves 500+ active users across 10+ countries, with a 95%+ IP trust score and 20-hour-a-day support.

Secure payment methods accepted: credit card, PayPal, Bitcoin, and more. Plans include 2 free modem replacements per 24 hours.

Start Integrating Coronium + Apify Today

Whether you are migrating a rental Actor before October 2026 or building your first pay-per-event scraper, Coronium mobile proxies plug into the Apify SDK in one line of ProxyConfiguration. Get an endpoint, copy the snippets above, and ship before the migration deadline lands.