
Conversation

@alimorgaan

Add Laravel Octane Support by Tracking Request ID

Problem

Laravel Octane keeps the application in memory across multiple requests for improved performance. As documented in the Laravel Octane documentation:

Since Octane boots your application once and keeps it in memory while serving requests, there are a few caveats you should consider while building your application. For example, the register and boot methods of your application's service providers will only be executed once when the request worker initially boots. On subsequent requests, the same application instance will be reused.

This creates a critical issue for Sushi models: the bootSushi() method runs only once, when the worker process starts, and is never called again for subsequent requests.

Impact

  • getRows() is never called again after the initial boot, meaning models serve stale data
  • Models cannot refresh their data between requests even if the underlying data source changes (see the example model below)
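
For illustration, here is a hypothetical Sushi model (not taken from this PR; the ExchangeRate and Currency classes are made up) whose rows are built from another model and therefore change between requests:

use Illuminate\Database\Eloquent\Model;
use Sushi\Sushi;

class ExchangeRate extends Model
{
    use Sushi;

    public function getRows()
    {
        // Rows are derived from another Eloquent model (Currency is a
        // made-up example). Under Octane this method only runs when the
        // worker boots, so later requests keep the rows from that first call.
        return Currency::query()
            ->get()
            ->map(fn ($currency) => [
                'code' => $currency->code,
                'rate' => $currency->rate,
            ])
            ->all();
    }
}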

Solution

This PR implements a request-aware connection resolver that detects when a new request starts and re-boots Sushi, ensuring getRows() and the migration process run fresh for each request.

Changes

  1. Added request tracking: Introduced $currentRequestId static property to track the current request
  2. Implemented request detection: Created a getCurrentRequestId() method (sketched after this list) that:
    • Uses spl_object_hash() on the current request instance to generate an ID that changes with each request (Octane-aware)
    • Falls back to REQUEST_TIME_FLOAT when no request instance is available
  3. Modified resolveConnection(): Now checks if the request has changed and re-boots Sushi when needed
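
A minimal sketch of what getCurrentRequestId() could look like, assuming Laravel's container has the current Illuminate\Http\Request instance bound under 'request'; the exact fallback and helpers are assumptions, not necessarily the PR's code:

protected static function getCurrentRequestId(): string
{
    // Under Octane every incoming request is a fresh Request object, so its
    // object hash changes per request even though the app stays in memory.
    if (app()->bound('request') && is_object(app('request'))) {
        return spl_object_hash(app('request'));
    }

    // Fallback when no request instance is bound (e.g. console commands):
    // use the request start time recorded by PHP.
    return (string) ($_SERVER['REQUEST_TIME_FLOAT'] ?? microtime(true));
}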

How It Works

public static function resolveConnection($connection = null)
{
    $requestId = static::getCurrentRequestId();

    // If request changed OR never booted, reset & boot
    if (static::$currentRequestId !== $requestId || static::$sushiConnection === null) {
        static::$currentRequestId = $requestId;
        static::$sushiConnection = null;
        static::bootSushi();
    }

    return static::$sushiConnection;
}
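
Resetting static::$sushiConnection to null before calling bootSushi() is what forces the trait to rebuild its connection, so the migration and getRows() run again for the new request instead of reusing the connection created when the worker first booted.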

- Add request ID tracking to detect when new requests start in Octane
- Implement resolveConnection() to re-boot Sushi on each new request
- Use spl_object_hash() for Octane-aware request detection
- Reset connection state when request changes to prevent stale data

This fixes issues where bootSushi() is only called once per worker
process in Octane, causing Sushi models to use stale data
across multiple requests.
@dimitri-koenig

I understand your argument, but in ALL of my cases the data never changes; that's why I use Sushi. So that would mean an (arguably small) performance penalty.

Wouldn't it make more sense to make this optional?

@boryn

boryn commented Nov 27, 2025

Hi! For contrast, we use Sushi to populate dynamic models based on user configs, so in our case the data constantly changes :) We don't use Octane (yet), but this would be an important improvement to the library.

@alimorgaan
Author

I understand your argument, but in ALL of my cases the data never changes; that's why I use Sushi. So that would mean an (arguably small) performance penalty.

Wouldn't it make more sense to make this optional?

Thanks, @dimitri-koenig, totally get your point. For static datasets, refreshing on every request would be unnecessary overhead.

In my case (and similar to what @boryn mentioned), I’m using Sushi to build a model dynamically from two other models, so the data changes constantly. That’s why the refresh is important for us.

To support both use cases, I added a sushi.refresh_on_request configuration option so the behavior is fully controllable.
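
A sketch of how that flag could gate the behavior, assuming a config/sushi.php file exposing refresh_on_request; the default value and exact wiring are assumptions:

public static function resolveConnection($connection = null)
{
    // Opt-out path: behave exactly as before and boot once per worker.
    if (! config('sushi.refresh_on_request', false)) {
        if (static::$sushiConnection === null) {
            static::bootSushi();
        }

        return static::$sushiConnection;
    }

    // Opt-in path: re-boot whenever a new request is detected (as above).
    $requestId = static::getCurrentRequestId();

    if (static::$currentRequestId !== $requestId || static::$sushiConnection === null) {
        static::$currentRequestId = $requestId;
        static::$sushiConnection = null;
        static::bootSushi();
    }

    return static::$sushiConnection;
}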

@dimitri-koenig

dimitri-koenig commented Nov 27, 2025

That would work. Don't want to be picky here, but wouldn't it make more sense to add that switch per model? Like a protected bool $octaneRefreshOnReload = false property, and then switch it to true if needed?

Then you would cover all cases: those models which should not reload and those models which actually should.
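
A hypothetical sketch of that per-model switch (the property name comes from the comment above; the trait-side check and model class are assumptions, not part of the current PR):

use Illuminate\Database\Eloquent\Model;
use Sushi\Sushi;

// A model that should rebuild its rows on every Octane request:
class DynamicReport extends Model
{
    use Sushi;

    protected bool $octaneRefreshOnReload = true;

    public function getRows()
    {
        // Rows assembled from other models or user configuration.
        return [
            ['id' => 1, 'name' => 'example row'],
        ];
    }
}

// Inside the Sushi trait, a helper like this could decide per model;
// models that don't declare the property keep the boot-once behavior:
protected static function shouldRefreshOnRequest(): bool
{
    $instance = new static;

    return property_exists($instance, 'octaneRefreshOnReload')
        && $instance->octaneRefreshOnReload;
}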

