June 11, 2024

Store of Stores

Ivars Mucenieks

Team Lead, Planhat

To keep all of our Pinia stores in sync, we've created a central registry (a "store of stores") through which all our Pinia stores are created. Any update from a store gets emitted to the registry and propagated to the other stores, which can subscribe to whichever events are relevant to them. This way, we keep data across all our stores continuously in sync, using a simple and consistent pattern that's easy to follow, yet scales and allows for a lot of flexibility. Here's an insight into how we architected the solution.

Looking for pragmatic solutions to non-trivial challenges

Taking a highly opinionated and fixed position is always easier: it allows your brain to run on autopilot. But inevitably the best answers to non-trivial problems are found by combining different recipes into pragmatic solutions, rather than cleanly executing a well-defined plan from the start.

So, naturally, we ended up with a solution where our Vue3 frontend uses a fairly large number of Pinia stores to hold data and manage state. Some datasets (like app settings or preferences) are small enough that we can keep a full local copy and only need a single dedicated store. In other cases - like lists of business objects, which for us might be companies or end users - the data can span millions and millions of rows, so we only keep a subset of items in the frontend at any given point in time. Imagine, for example, a tabulated list of customers: each row in the table represents one customer, with columns for a few selected properties.
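
As a rough sketch of those two shapes (the store names and fields below are illustrative examples, not our actual stores), it might look something like this:

// Hypothetical sketch - store names and fields are illustrative examples
import { defineStore } from 'pinia';
import { ref } from 'vue';

// Small dataset: keep a full local copy in a single store
export const useSettingsStore = defineStore('settings', () => {
  const settings = ref<Record<string, unknown>>({});
  return { settings };
});

// Large dataset: only hold the subset currently shown, e.g. one page of a customer table
type TCompanyRow = { _id: string; name: string; owner?: string };

export const useCompaniesStore = defineStore('companies', () => {
  const rows = ref<TCompanyRow[]>([]); // the rows currently rendered
  const total = ref(0);                // total row count on the backend
  return { rows, total };
});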



Of course, if we only had these two stores, we wouldn't need to spend any time thinking about this: any hack to make the stores update one another would do fine. But unfortunately that's not the case. We operate with a very large number of stores, and many developers interacting with them, so it makes sense to spend some time thinking about how to solve the storage and sync problem generically - in a way that is simple and provides an easily repeatable pattern for our entire Vue3 app.

Store of Stores

Whenever we need to make a critical, conceptual decision about the app architecture, we involve multiple stakeholders from across our engineering team. Sometimes a problem statement is already enough to go on, and we hop on a quick call to discuss potential routes. Some version of this happens multiple times a week, if not daily.

In other cases, we do a lot more background research, with different developers preparing independent POCs and pitching their potential solution. This is how we decided between VueJS and Svelte a few years ago (we'll touch on this in another post!). And sometimes a problem is sufficiently complex to warrant complete focus, so we drop the daily task load and pick somewhere to meet and iterate together until we crack it.

When it came to solving our storage problem, we took the middle road: a few team members prepped diagrams to share their reflections and describe their preferred solution, then we got on a call to hash it out. I find that when all the information is on the table and everyone's had the creative freedom to explore possible pathways, we tend to reach the same conclusion pretty quickly (most of the time…). Here's a snapshot of the different proposed routes:


Off the back of these proposals, we decided to create a central registry (a "Store of Stores") through which all our Pinia stores are created. Before, we might have had something like this:


// stores/useUserStore.ts
import { defineStore } from 'pinia';

export const useUserStore = defineStore('user', {});

But with the new pattern it would look like this instead. First, defining the central store; then, how it's used to create a new store somewhere else:


// piniaStore/phStore.ts
import { defineStore } from 'pinia';
import _ from 'lodash';

// All created stores, keyed by store name
const initializedStores: any = {};

// Emit update function for all stores: forwards an update to every registered
// store (except the one that emitted it) that exposes a phStoreListener
const phStoreEmitter = <T>(obj: TUpdateObject<T>): void => {
  // cut-out for brevity

  for (const storeName in initializedStores) {
    const store = initializedStores[storeName]();
    if (store && _.isFunction(store.phStoreListener) && storeName !== obj.storeName) {
      store.phStoreListener(obj);
    }
  }
};

// Wrapper around defineStore that registers every new store in the registry
const phStore = <T>(storeName: EStoresNames, storeOptions: () => T) => {
  const store = defineStore(storeName, storeOptions);
  initializedStores[storeName] = store;
  return store;
};

export { phStore, phStoreEmitter };


import { phStore, phStoreEmitter } from "piniaStore/phStore";
import { ref } from 'vue';
import _ from 'lodash';
// TProfile, TUpdateObject, EUpdateAction, EStoresNames and phAssign are shared project types/helpers

export const useMeStore = phStore(EStoresNames.ME, () => {
  // the current user's profile (initialisation cut-out for brevity)
  const profile = ref({} as TProfile);

  // React to updates emitted by other stores via the registry
  function phStoreListener(event: TUpdateObject) {
    if (event.action === EUpdateAction.UPDATED) {
      assign(event.object);
    }
  }

  function assign(updateObj: Partial<TProfile>) {
    const oldValue = _.cloneDeep(profile.value);
    phAssign(profile.value, updateObj);

    // Emit that a user got updated (in this case it's my profile), so other stores can listen to that
    phStoreEmitter({
      // payload cut-out for brevity
    });
  }

  return { profile, phStoreListener, assign };
});

This new architecture automatically ensures that all the stores used across the app register in the central registry without adding any complexity when it comes to creating a simple store. It also keeps new code to a minimum for connected stores that need to emit and receive updates from other stores.

I also love that the connectedness is optional: you can create a new store without having to worry about any of this, and then later on - as it evolves, or you discover it needs to communicate with other stores - connecting it becomes a trivial add-on.
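
For example, a brand-new store can be created through phStore just like a plain Pinia store, and only exposes a phStoreListener once it actually has something to react to (the store name and field below are hypothetical, for illustration only):

// Hypothetical example - EStoresNames.TAGS and the tags field are illustrative
import { ref } from 'vue';
import { phStore } from "piniaStore/phStore";

export const useTagsStore = phStore(EStoresNames.TAGS, () => {
  const tags = ref<string[]>([]);

  // No phStoreListener here: the store is still registered in the registry,
  // it simply ignores updates from other stores.
  // To connect it later, define a listener and add it to the returned object:
  // function phStoreListener(event: TUpdateObject) { /* react to relevant events */ }

  return { tags };
});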

Live Collaboration

Keeping data in sync between local stores is of course a must for creating a reasonable user experience.

But for a modern app founded on the notion of seamless collaboration around your data, we strive to keep ALL of your data in sync. Beyond updates by another local Pinia store, data can be changed by a team member on another computer, or even from a 3rd party tool outside of Planhat, like Salesforce, Hubspot, Zendesk or anything else that connects bi-directionally with our platform.

With our Store of Stores approach, we get this almost out of the box. All we need to do is have the API react to changes in the data and send the update to the relevant users over sockets. On the frontend, we only need to subscribe to these events in one place and emit the update just as any other store would have done. The result is that - with no additional code - all the data across the whole app, and for all users, is now "live updated" and kept continuously in sync.
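
As a rough sketch of what that single subscription point could look like (the socket library usage, endpoint, event name and payload shape here are assumptions for illustration, not our actual implementation):

// Hypothetical sketch of the frontend's single socket subscription point.
// The endpoint, event name and payload shape are illustrative assumptions.
import { io } from 'socket.io-client';
import { phStoreEmitter } from "piniaStore/phStore";

const socket = io('/realtime');

socket.on('modelUpdated', (payload: TUpdateObject<unknown>) => {
  // Re-emit the server-side change through the registry, exactly as a local
  // store would have done - every store with a phStoreListener picks it up.
  phStoreEmitter(payload);
});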


This last part is a fairly new feature in the latest version of Planhat, and admittedly we still have some work to do on it. One notable enhancement would be an improved UI/UX that gives the user better cues as to what is happening when data magically keeps updating on the screen based on updates from colleagues or 3rd party tools (see a prototype of this behaviour above).

Even if data across systems isn't updating particularly frequently, knowing that you're seeing it all live - and seeing these live updates in action - certainly creates a special feeling.


Ivars Mucenieks

Team Lead, Planhat

Ivars is based in Latvia and joined Planhat as a developer back in 2019. Demonstrating exceptional output and technical depth, he has taken on lead roles in several major frontend projects over the past few years and is part of the tech management team.
