Loki Time Master - A service to enable time tracking for teams in Notion
Time tracking of team members, billing and reports have always been a pain. There are a thousand solutions available, but probably none of them fits your specific needs exactly. Since we have been leaning heavily on Notion for the last two years and basically all of our project management is done there, we are trying to get rid of the last remaining pieces: personal Excel files for time tracking and Kanban boards from other providers.
Our approach to company time tracking, billing and reporting in Notion.
To give that child a name, we call it “Loki”, inspired by Marvel's Loki, who (spoiler alert) takes over the Temporal Loom to manage all the different timelines of the multiverse.
The situation
Currently, we are collecting time log exports from Notion project databases and merging them for project controlling and billing purposes. But this doesn't give us a feeling for the utilization of individual team members and the potential struggle of handling multiple projects and other responsibilities. Concretely, the setup looks like this:
- tracking tasks in different Notion databases, one for each project
- team members tracking project time in time log databases for each project
- team members keeping personal Excel files for time tracking, both to have a complete picture for themselves and to have a single point of data entry
- team members then regularly syncing their personal time sheets with the various project boards to enable billing
The challenge
This setup worked initially, but as our team grew, cracks began to show, leading to the following challenges:
- team members insist on a full picture for themselves including internal, non-project time
- double effort tracking time in personal files and syncing project time logs
- no complete picture of time spent for company management
- Notion automations seem to have limitations when it comes to updating related items and are a little vulnerable to user error (rows can be added and deleted VERY easily)
The solution approach
- each team member can have their own personal Notion time log database
- the personal time log is connected to the project tasks from various project boards, but only for the projects that they participate in
- personal time logs are aggregated automatically to get a complete picture over the whole company and to facilitate billing and reports
But Notion currently has limitations
Since Notion currently lacks sophisticated database and programming functions, we found we needed to develop an external service that handles and syncs the Notion data. We all hope we can get rid of it in the future as Notion makes progress. We tried to keep this separate service as thin as possible to keep the maintenance effort low.
The external service that we are developing
Basically, we are aggregating time logs, storing them in a database and making them available for further processing and reporting via exports to Excel or our custom grid.
To do so, we need to sync more than just time logs:
- synchronize customers, projects and users: Notion → Loki
- synchronize tasks from projects: Notion → Loki
- synchronize and backup time logs with the optional connection to tasks: Notion → Loki
- distribute project tasks to personal task databases of project members to allow linking of personal time logs: Notion → Notion via Loki
And all of that comes with limitations in Notion
The Notion API allows an average of 3 requests per second and returns at most 100 items per database query. Synchronizing all those items, which only grow in number over time, needs to be handled carefully and takes some time to process.
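For illustration, paging through one database with the official @notionhq/client SDK looks roughly like the sketch below. This is not our production code: the function name and the hard-coded throttle are placeholders, and a proper rate limiter would handle this more elegantly.

```ts
import { Client } from "@notionhq/client";

const notion = new Client({ auth: process.env.NOTION_TOKEN });

// Fetch every row of a database, 100 at a time, while pausing between
// requests to stay below the ~3 requests per second rate limit.
async function queryAllPages(databaseId: string) {
  const pages: Awaited<ReturnType<typeof notion.databases.query>>["results"] = [];
  let cursor: string | undefined;

  do {
    const response = await notion.databases.query({
      database_id: databaseId,
      start_cursor: cursor,
      page_size: 100, // hard maximum per request
    });
    pages.push(...response.results);
    cursor = response.has_more ? response.next_cursor ?? undefined : undefined;

    // Naive throttle: ~350 ms between requests keeps one job under 3 req/s.
    await new Promise((resolve) => setTimeout(resolve, 350));
  } while (cursor);

  return pages;
}
```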
Solution design
So now we have our requirements: users should have their personal time log table, which is connected to their personal task table, so that the task dropdown only shows tasks that are relevant to them. This personal task table needs to be filled with data from the various project task tables, and all of these sync processes need to be configured somehow. This led us to a design where we store the configuration of the setup in yet another Notion table.
As a result, we have the following components in the system:
Databases in Notion
- Master Databases … these are the “configuration UI” for Loki. This way, we only need to provide the IDs of these master databases to our app, and it can pull all additional config without needing any custom user interfaces.
    - Master User Database
      … identifiers for users
      … ID of the personal task database
      … ID of the personal time log database
    - Master Customer Database
      … all customer information in one place
    - Master Project Database
      … linked to the Master Customer Database
      … list of team members
      … ID of the task database of the project
    - Master Log Database
      … logs of the various sync processes, to get insight into the current status of Loki
- Project XXX Task Database
  … the project-specific task database used in project management; the source where project teams manage their day-to-day work
- Person XXX Task Database
  … receives all tasks for the user from all project databases where this user is a member
- Person XXX Time Log Database
  … time logs of the user, linked to tasks from the Person XXX Task Database
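For illustration, the configuration Loki reads from the master databases boils down to a handful of IDs and relations. A rough sketch in TypeScript (the property names are made up for readability, not our actual Notion schema):

```ts
// Roughly the shape of the configuration Loki pulls from the master databases.
// Property names are illustrative, not the real Notion property names.
interface MasterUser {
  notionUserId: string;
  personalTaskDatabaseId: string;     // Person XXX Task Database
  personalTimeLogDatabaseId: string;  // Person XXX Time Log Database
}

interface MasterProject {
  projectPageId: string;
  customerPageId: string;        // relation to the Master Customer Database
  teamMemberIds: string[];       // members who should receive the project's tasks
  projectTaskDatabaseId: string; // Project XXX Task Database
}
```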
Within the external service Loki
- central SQL database with multiple tables, storing the synchronized data in a consistent, interlinked data model
- endpoints in a Next.js app for the various synchronization workflows:
    - update of the project configuration
    - update of the user configuration
    - backup of the Person XXX Time Log Databases
    - distribution of tasks from the Project XXX Task Databases to the Person XXX Task Databases
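To make the endpoint part a bit more tangible, one of these synchronization endpoints could look roughly like this with the Next.js App Router. The route path, payload and the scheduleJob helper are illustrative, not Loki's actual code; the scheduling itself is covered in the next section.

```ts
// app/api/sync/time-logs/route.ts (illustrative path)
import { NextResponse } from "next/server";

// Placeholder for the job scheduling described in the next section.
declare function scheduleJob(jobType: string, payload: unknown): Promise<void>;

export async function POST(request: Request) {
  const { userId } = await request.json();

  // The endpoint only schedules the sync. The actual Notion reads and writes
  // happen in the job runner, so the HTTP request can return quickly.
  await scheduleJob("backup-person-time-log", { userId });

  return NextResponse.json({ status: "scheduled" });
}
```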
Scaling and handling potential issues
In theory, many of these processes could fully leverage Notion automations to sync updated or created records automatically. But what happens if Loki itself (or our hosting provider, Vercel) is temporarily unavailable? In these cases we have to make sure not to miss any updates. This is why all of our sync processes follow the same pattern:
- Check when the Sync was successful for the last time
- Get all data from Notion that changed after this last sync time
- Write this new data to the target system in order of its updated-at time.
This way, no matter where a process fails between those steps, you can simply start it again to recover all data.
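As a sketch, the skeleton of such a sync job looks roughly like this. The four helpers are placeholders for Loki's persistence layer, and we assume writes are idempotent upserts keyed by the Notion page ID, so replaying them after a failure is harmless:

```ts
// Skeleton of a recoverable sync run. The helpers below are placeholders for
// Loki's own persistence layer.
type ChangedItem = {
  notionPageId: string;
  lastEditedTime: string; // ISO timestamp from Notion's last_edited_time
  properties: unknown;
};

declare function loadLastSyncTime(jobName: string): Promise<string>;
declare function queryNotionChangedSince(since: string): Promise<ChangedItem[]>;
declare function upsertIntoLoki(item: ChangedItem): Promise<void>;
declare function saveLastSyncTime(jobName: string, time: string): Promise<void>;

async function runSync(jobName: string): Promise<void> {
  // 1. When did this job last finish successfully?
  const lastSync = await loadLastSyncTime(jobName);

  // 2. Only fetch items that Notion reports as edited after that point.
  const changed = await queryNotionChangedSince(lastSync);

  // 3. Write the oldest edits first. If the run dies halfway through, nothing
  //    is lost: the marker has not moved, so a re-run fetches everything again
  //    and the idempotent upserts simply replay.
  changed.sort((a, b) => a.lastEditedTime.localeCompare(b.lastEditedTime));
  for (const item of changed) {
    await upsertIntoLoki(item);
  }

  // 4. Only advance the marker once everything has been written.
  await saveLastSyncTime(jobName, new Date().toISOString());
}
```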
Workflows in a serverless world
To get up and running quickly, Loki is a Next.js app hosted on Vercel. This way, we spend basically no effort on maintenance or deployment work. However, Vercel is a serverless platform: you cannot know whether 1 instance is currently running, or 100. For us, this is actually a downside:
- Our jobs are limited by the Notion API rate limit. Running two separate processes in parallel slows each one down
- Getting the same data twice from Notion because the same job is running in parallel further slows us down because of the API rate limit.
That means we have to make sure to only ever run one instance of a sync job. We can do this quite simply by using the database as a locking mechanism: when a job is triggered, we first check whether it is already running, and if so, place an entry in a queue (a database table of waiting jobs). Whenever a job finishes, it checks the queue for pending jobs and triggers them.
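In pseudo-real TypeScript, the locking and queueing logic looks roughly like this. All helpers are placeholders for plain SQL against our own tables; acquiring the lock has to be an atomic operation (for example a conditional insert or update) so two instances can never both succeed.

```ts
// The database as a lock: a job row marks a running job, and a queue table
// holds requests that arrive while it runs. The helpers are placeholders.
type QueuedJob = { jobType: string; payload: unknown };

declare function tryAcquireJobLock(jobType: string): Promise<boolean>; // must be atomic
declare function releaseJobLock(jobType: string): Promise<void>;
declare function enqueueJob(job: QueuedJob): Promise<void>;
declare function dequeueNextJob(jobType: string): Promise<QueuedJob | null>;
declare function runJob(job: QueuedJob): Promise<void>;

async function triggerJob(job: QueuedJob): Promise<void> {
  // Another instance already holds the lock? Just park the request and return.
  if (!(await tryAcquireJobLock(job.jobType))) {
    await enqueueJob(job);
    return;
  }

  try {
    await runJob(job);
  } finally {
    await releaseJobLock(job.jobType);
    // Whenever a job finishes, it checks the queue and triggers the next one.
    const next = await dequeueNextJob(job.jobType);
    if (next) await triggerJob(next);
  }
}
```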
And lastly, Vercel has a limit on request duration. To work around this, we chop our jobs into smaller pieces and trigger each piece with a new fetch request to the server itself. This way we make more requests, but each one stays well below the duration limit.
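A rough sketch of such a chunked job as a route handler that re-triggers itself. The path, chunk size and processItems helper are made up for this example; we use something like Vercel's waitUntil so the trigger request can still go out after the response has been sent.

```ts
// app/api/jobs/continue/route.ts (illustrative path). Each invocation handles
// one small slice of a job, then triggers the next slice as a fresh request,
// so no single invocation comes close to the duration limit.
import { NextResponse } from "next/server";
import { waitUntil } from "@vercel/functions";

const CHUNK_SIZE = 50; // sized so one slice finishes comfortably in time

// Placeholder: processes up to `limit` items and returns how many it handled.
declare function processItems(jobId: string, offset: number, limit: number): Promise<number>;

export async function POST(request: Request) {
  const { jobId, offset } = await request.json();

  const processed = await processItems(jobId, offset, CHUNK_SIZE);

  if (processed === CHUNK_SIZE) {
    // More work remains: fire the next slice at ourselves. waitUntil lets the
    // trigger request go out even though we respond to this one right away.
    waitUntil(
      fetch(`${process.env.APP_URL}/api/jobs/continue`, {
        method: "POST",
        headers: { "content-type": "application/json" },
        body: JSON.stringify({ jobId, offset: offset + CHUNK_SIZE }),
      }).catch((err) => console.error("failed to trigger next chunk", err))
    );
  }

  return NextResponse.json({ processed, nextOffset: offset + CHUNK_SIZE });
}
```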
While Notion offers very intuitive interfaces for data entry and maintenance, it isn’t perfect yet. Loki bridges this gap, giving us a flexible, company-wide solution. We’re excited to see where this takes us as Notion evolves.
What do you think?
So that's a short wrap-up of our thoughts and approach, but we would like to get your opinion on the solution!
- what are we missing?
- are we somehow wrong on certain points?
- how can we get more out of it?
- could it be integrated more deeply with the current capabilities of Notion?
Text me :) robert.kramer@esveo.com