
Everything You Need to Know to Master Server-Sent Events (SSE) in Symfony 7.4

2025/12/10 04:00

I often see teams reach for WebSockets (via Socket.io or Pusher) when they simply need to update a UI with server-side state changes. While WebSockets are powerful, they are often overkill. If your requirement is unidirectional (Server → Client) — like stock tickers, progress bars, or notification feeds — Server-Sent Events (SSE) is the protocol you should master.

SSE runs over standard HTTP/1.1 or HTTP/2. It doesn’t require complex handshakes, works with standard authentication mechanisms, and native support is built into every modern browser via the EventSource API.

In this guide, we will implement SSE in Symfony 7.4 using two approaches:

  1. The Native Approach: Using StreamedResponse (Great for understanding the protocol and simple, low-concurrency tasks).
  2. The Production Approach: Using the Mercure protocol (The scalable, non-blocking standard for Symfony).

The Native Approach (StreamedResponse)

Symfony’s StreamedResponse allows you to keep an HTTP connection open and flush data to the client incrementally. This is the “raw” implementation of the SSE standard.

Create a new controller. We will use a generator or a loop to simulate streaming data.

```php
namespace App\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\StreamedResponse;
use Symfony\Component\Routing\Attribute\Route;

#[Route('/api/stream', name: 'api_stream_stock')]
class StockStreamController extends AbstractController
{
    #[Route('/stock/{symbol}', name: 'stock_ticker', methods: ['GET'])]
    public function stream(string $symbol): StreamedResponse
    {
        // 1. Create the StreamedResponse
        $response = new StreamedResponse(function () use ($symbol) {
            // 2. Prevent PHP time limits for long-running processes
            set_time_limit(0);

            // 3. Simple loop to simulate a data stream
            // In a real app, you might check Redis or a database here
            $i = 0;
            while (true) {
                // Connection check: stop if the client disconnected
                if (connection_aborted()) {
                    break;
                }

                $data = [
                    'symbol'    => strtoupper($symbol),
                    'price'     => rand(100, 200),
                    'timestamp' => date('c'),
                    'sequence'  => $i++,
                ];

                // 4. Format according to the SSE spec
                // Format: "data: {payload}\n\n"
                echo "event: stock_update\n"; // Optional event name
                echo 'data: ' . json_encode($data) . "\n\n";

                // 5. Critical: flush the buffer to send data immediately
                ob_flush();
                flush();

                // Simulate delay
                sleep(2);
            }
        });

        // 6. Set headers specifically for SSE
        $response->headers->set('Content-Type', 'text/event-stream');
        $response->headers->set('Cache-Control', 'no-cache');
        $response->headers->set('X-Accel-Buffering', 'no'); // Crucial for Nginx

        return $response;
    }
}
```

You do not need a library for this. The browser’s native EventSource is robust.

```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Symfony SSE Stream</title>
</head>
<body>
    <h1>Live Ticker: <span id="symbol">LOADING...</span></h1>
    <div id="price" style="font-size: 2rem;">--</div>
    <ul id="log"></ul>

    <script>
        const symbol = 'AAPL';

        // Connect to the Symfony endpoint
        const eventSource = new EventSource(`/api/stream/stock/${symbol}`);

        // Listen for the specific 'stock_update' event name defined in PHP
        eventSource.addEventListener('stock_update', (e) => {
            const data = JSON.parse(e.data);

            document.getElementById('symbol').innerText = data.symbol;
            document.getElementById('price').innerText = '$' + data.price;

            // Log for visibility
            const li = document.createElement('li');
            li.innerText = `${data.timestamp}: $${data.price}`;
            document.getElementById('log').prepend(li);
        });

        eventSource.onerror = (err) => {
            console.error("EventSource failed:", err);
            // EventSource auto-reconnects by default, but you can handle closure here
        };
    </script>
</body>
</html>
```

Server Configuration (Critical)

This is where most developers fail. Web servers (nginx, etc.) love to buffer output to optimize compression. For SSE, buffering kills the stream (the client receives nothing until the buffer fills).

You must disable fastcgi_buffering or use the X-Accel-Buffering: no header we added in the controller.

```nginx
# inside your "location ~ \.php$" block
fastcgi_split_path_info ^(.+\.php)(/.+)$;
# ... standard config ...

# Ensure buffering is off for SSE if headers fail
fastcgi_buffering off;
```

Check your php.ini. Ensure output_buffering is set to Off or a low value.
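For reference, the relevant directive looks like this (a minimal sketch; adjust whichever php.ini your PHP-FPM pool actually loads):

```ini
; php.ini
; Off (or a small value such as 4096) keeps PHP from holding SSE frames back
output_buffering = Off
```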

Verification:

  1. Start your local server: symfony server:start

  2. Open your terminal.

  3. Use curl to verify the stream without a browser:

```bash
curl -N -v http://127.0.0.1:8000/api/stream/stock/AAPL
```

  • -N: Disables buffering in curl.
  • Expected Output: You should see one JSON object appear every 2 seconds (sample frames below).
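If everything is wired up, the raw frames on the wire should look roughly like this (values are illustrative; the shape follows the controller above):

```text
event: stock_update
data: {"symbol":"AAPL","price":153,"timestamp":"2025-12-10T04:00:00+00:00","sequence":0}

event: stock_update
data: {"symbol":"AAPL","price":187,"timestamp":"2025-12-10T04:00:02+00:00","sequence":1}
```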

The “Senior” Reality Check (Why Native is Dangerous)

While the code above works, I rarely recommend it for high-traffic production apps.

PHP is synchronous. When a client connects to /api/stream/stock/AAPL, that PHP-FPM worker process enters a while(true) loop. It is locked.

If you have 50 PHP-FPM workers and 50 users open this page, your entire site goes down. No other requests can be processed.

It also keeps a database connection open if you aren’t careful.
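If your streaming loop does hit the database, one mitigation is to release the Doctrine connection between iterations instead of pinning it for the lifetime of the stream. A sketch, assuming a doctrine/dbal Connection is injected into the controller (the table and query are hypothetical):

```php
use Doctrine\DBAL\Connection;

// Inside the streaming loop
$price = $connection->fetchOne(
    'SELECT price FROM stock_quote WHERE symbol = ?',
    [strtoupper($symbol)]
);

// Close the connection so it is not held open while we sleep;
// DBAL reconnects transparently on the next query.
$connection->close();
```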

You have to offload open-connection management to a dedicated service specifically designed for it. In the Symfony ecosystem, that solution is Mercure.

The Production Approach (Mercure)

Mercure is an open protocol for real-time updates. It acts as a Hub.

  1. Client connects to the Hub (not your PHP app) to listen for events.
  2. Symfony sends a single POST request to the Hub when data changes.
  3. Hub broadcasts the data to thousands of listeners.
  4. Result: Zero blocking PHP processes (a minimal setup sketch follows below).
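Before diving into authorization, here is a minimal setup sketch: install the bundle, point the app at your Hub in `.env` (the values below are the Symfony Flex recipe placeholders; replace them with your own Hub URL and a strong secret), and publish updates through the injected `HubInterface`. The publisher service shown is hypothetical, for illustration only.

```bash
composer require symfony/mercure-bundle
```

```bash
# .env (recipe placeholders)
MERCURE_URL=https://example.com/.well-known/mercure
MERCURE_PUBLIC_URL=https://example.com/.well-known/mercure
MERCURE_JWT_SECRET="!ChangeThisMercureHubJWTSecretKey!"
```

```php
// src/Service/StockPublisher.php (hypothetical service, for illustration only)
namespace App\Service;

use Symfony\Component\Mercure\HubInterface;
use Symfony\Component\Mercure\Update;

final class StockPublisher
{
    public function __construct(private HubInterface $hub)
    {
    }

    public function publishPrice(string $symbol, float $price): void
    {
        // One quick POST to the Hub; the Hub fans the update out to every subscriber
        $this->hub->publish(new Update(
            "https://mysite.com/stocks/{$symbol}",
            json_encode(['symbol' => $symbol, 'price' => $price])
        ));
    }
}
```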

In a production environment, you cannot leave your stream public. You need to control who can listen to what.

Mercure uses JSON Web Tokens (JWT) for this.

  • Publisher JWT: Allows your Symfony app to push data (handled automatically by the bundle via MERCURE_JWT_SECRET).
  • Subscriber JWT: Allows a user’s browser to listen to private updates (the decoded claim looks like the example below).
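For reference, the decoded payload of a subscriber token carries a `mercure` claim listing the topics (or URI templates) the holder may listen to, roughly like this:

```json
{
  "mercure": {
    "subscribe": [
      "https://mysite.com/user/123/alerts"
    ]
  }
}
```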

Here is how to implement the Subscriber JWT flow in Symfony 7.4.

Install JWT Library

We need a library to generate tokens manually for our users. lcobucci/jwt is the standard in the PHP ecosystem.

```bash
composer require lcobucci/jwt
```

The Token Generator Service

Create a service that generates a signed JWT for a specific user. This token will contain a mercure claim listing the topics the user is allowed to access.

```php
// src/Service/MercureTokenGenerator.php
namespace App\Service;

use Lcobucci\JWT\Configuration;
use Lcobucci\JWT\Signer\Hmac\Sha256;
use Lcobucci\JWT\Signer\Key\InMemory;
use Symfony\Component\DependencyInjection\Attribute\Autowire;

class MercureTokenGenerator
{
    private Configuration $config;

    public function __construct(
        // Inject the same secret used in your .env for the Mercure Hub
        #[Autowire('%env(MERCURE_JWT_SECRET)%')]
        private string $secret
    ) {
        $this->config = Configuration::forSymmetricSigner(
            new Sha256(),
            InMemory::plainText($this->secret)
        );
    }

    public function generate(string $userTopic): string
    {
        return $this->config->builder()
            ->withClaim('mercure', [
                'subscribe' => [
                    $userTopic, // Allow access to this specific topic
                    // 'https://mysite.com/books/{id}' // You can add patterns here
                ]
            ])
            ->getToken($this->config->signer(), $this->config->signingKey())
            ->toString();
    }
}
```

Injecting the Token (The Cookie Method)

The native browser EventSource API does not support sending HTTP Headers (like Authorization: Bearer).

To solve this without adding heavy JavaScript polyfills, we use a Cookie. The Mercure Hub automatically looks for a cookie named mercureAuthorization.

```php
// src/Controller/StockPageController.php
namespace App\Controller;

use App\Service\MercureTokenGenerator;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\Cookie;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Routing\Attribute\Route;

class StockPageController extends AbstractController
{
    #[Route('/dashboard', name: 'app_dashboard')]
    public function index(MercureTokenGenerator $tokenGenerator): Response
    {
        // 1. Define the private topic this user is allowed to see
        $userTopic = 'https://mysite.com/user/123/alerts';

        // 2. Generate the JWT
        $token = $tokenGenerator->generate($userTopic);

        // 3. Create the response (render your Twig template)
        $response = $this->render('dashboard/index.html.twig', [
            'userTopic' => $userTopic
        ]);

        // 4. Attach the JWT as a Cookie
        // The Hub must be on the same domain (or a subdomain) for this to work
        $response->headers->setCookie(Cookie::create(
            'mercureAuthorization',
            $token,
            0,                        // Session cookie
            '/.well-known/mercure',   // Path (critical: restricts cookie to the Hub)
            null,                     // Domain (null = current domain)
            false,                    // Secure (true if HTTPS)
            true,                     // HttpOnly
            false,                    // Raw
            'strict'                  // SameSite
        ));

        return $response;
    }
}
```

Updating the Publisher

Now, update your publisher service to mark the update as Private.

```php
// ...
public function publishPrivateAlert(string $userId, string $message): void
{
    $topic = "https://mysite.com/user/{$userId}/alerts";

    $update = new Update(
        $topic,
        json_encode(['alert' => $message]),
        true // <--- TRUE marks this as Private
    );

    $this->hub->publish($update);
}
```

Client Side (No Changes Needed)

Because we used a Cookie, the JavaScript remains exactly the same. The browser will automatically send the mercureAuthorization cookie when it connects to the Hub.

```javascript
// The browser sends the cookie automatically!
const eventSource = new EventSource(hubUrl);
```
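If you want to see the full wiring, here is a sketch (the Hub address and topic are assumptions consistent with the dashboard example above; in practice you would print them into the template from `userTopic` and your Hub configuration):

```javascript
// Build the subscription URL: the Hub address plus the topic(s) to listen to
const hubUrl = new URL('https://mysite.com/.well-known/mercure');
hubUrl.searchParams.append('topic', 'https://mysite.com/user/123/alerts');

// The mercureAuthorization cookie set by /dashboard is sent along automatically
const eventSource = new EventSource(hubUrl);

// Private updates arrive as plain "message" events unless the publisher names them
eventSource.onmessage = (e) => {
    const data = JSON.parse(e.data);
    console.log('New alert:', data.alert); // matches json_encode(['alert' => $message])
};
```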

Verification

  1. Clear Cookies: Clear your browser cookies for localhost.
  2. Load Dashboard: Visit /dashboard. Inspect the Network tab. You should see a Set-Cookie header with mercureAuthorization.
  3. Check Connection: The EventSource request to the Hub should now be green (200 OK).
  4. Test Privacy: Try to curl the Hub directly without the cookie (example below); you will receive a 401 Unauthorized for private topics.
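A quick way to check step 4 (a sketch; the address assumes a local Hub and the topic matches the dashboard example):

```bash
# No mercureAuthorization cookie is sent, so the Hub should answer 401 Unauthorized
curl -N -v 'https://localhost/.well-known/mercure?topic=https%3A%2F%2Fmysite.com%2Fuser%2F123%2Falerts'
```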

Conclusion

As we have explored, implementing real-time data in Symfony 7.4 isn’t just about opening a connection; it is about choosing the architecture that matches your traffic patterns.

While the Native StreamedResponse offers a quick, dependency-free route for administrative tasks (like export progress bars) or low-traffic internal tools, it carries significant risks regarding PHP worker exhaustion. It is a synchronous solution in an asynchronous world.

For any user-facing feature, whether it’s a live dashboard, a notification center, or a collaborative tool, Mercure is the non-negotiable standard for the modern Symfony ecosystem. It completely decouples your application logic from connection management, allowing your Symfony backend to remain stateless and performant while the Mercure Hub (built on Caddy) handles the heavy lifting of maintaining thousands of idle connections.

Mastering these patterns allows you to move beyond simple “request-response” lifecycles and build applications that feel alive and responsive.

Let’s Stay in Touch

Real-time architecture often brings hidden complexities regarding load balancing, reverse proxy configuration (Nginx/HAProxy), and security boundaries.

Let’s discuss how we can make your application real-time ready.
