Scalable and customizable GTV reporting for Fairmart customers
A Painful Requirement We Tried to Avoid
One of the quieter requirements of running a retail business in Singapore is reporting: tenants in shopping malls are typically required to report their transaction volume to their landlords[^1]. The mall operator wants your point-of-sale data submitted in their specific format, on their schedule, delivered to their system. And every mall has a different spec: different columns, different date formats, different payment type codes, different upload endpoints.
For Fairmart, this was a real problem. Our customers are retailers, and the ones we most wanted to serve — the ones operating at scale — were almost always in malls. But building a custom integration for every mall operator looked like an open-ended engineering commitment. So we kept deprioritizing it.
Why It Kept Getting Pushed
The engineering calculus was simple: a one-off integration for a single mall is straightforward. The anxiety was around the second, third, and tenth one. If each integration required meaningful custom work and ongoing maintenance, the cost would compound. We didn’t have a clean answer for how to scale it, so we avoided starting.
This is a pattern I’ve seen at a lot of startups. The problem isn’t that the work is hard — it’s that the work feels unbounded, and bounded work always wins.
The Reframe: Build It AI-First
What changed our thinking was approaching it as an AI-first project from the start.
Rather than designing a generalized system that could handle any mall’s requirements, we let AI handle the customization layer. Each integration is simple by design — a short script that knows how to pull data from our Elastic database, transform it to one specific mall’s required format, and upload it via FTP. There’s no abstraction trying to cover every case. There’s just a clear, readable pipeline, written quickly with AI assistance, tuned to one customer’s exact spec.
The key insight: the marginal cost of the second integration is much lower than the first when AI is doing most of the writing. You’re not building a framework. You’re just doing it again.
How It Works
The pipeline has three stages:
1. Export — Pull the relevant POS transaction data from Fairmart’s Elastic database for the reporting period.
2. Convert — Transform the data to the mall operator’s required format. This is where the customization lives: column mappings, GST calculations, payment type codes, filename conventions, line endings — all of it specified per customer.
3. Upload — Deliver the converted file to the mall operator via FTP or SFTP, depending on what they accept.
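The three stages above can be sketched in Python. Everything here is illustrative: the per-mall spec fields, column names, and the GST treatment (9% inclusive rate, Singapore's current rate) are assumptions, not Fairmart's actual schema, and the export step is stubbed since the real query runs against Elasticsearch.

```python
import csv
import io
from datetime import date

# Hypothetical per-mall spec. In practice each mall gets its own copy of
# this structure: column order, date format, payment codes, filename, line endings.
MALL_SPEC = {
    "columns": ["TXN_DATE", "RECEIPT_NO", "GROSS_AMT", "GST_AMT", "PAY_CODE"],
    "date_format": "%Y%m%d",
    "payment_codes": {"card": "01", "cash": "02", "qr": "05"},
    "filename": "SALES_{date}.csv",
    "line_ending": "\r\n",
}

def export(period_start, period_end):
    """Stage 1 (stubbed): pull POS transactions for the period.

    The real version queries Elasticsearch; returning a list of dicts here."""
    raise NotImplementedError

def convert(transactions, spec):
    """Stage 2: transform exported rows into one mall's required CSV format."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator=spec["line_ending"])
    writer.writerow(spec["columns"])
    for t in transactions:
        gross = t["amount"]
        # Assumes amounts are GST-inclusive at Singapore's 9% rate.
        gst = round(gross * 9 / 109, 2)
        writer.writerow([
            t["date"].strftime(spec["date_format"]),
            t["receipt"],
            f"{gross:.2f}",
            f"{gst:.2f}",
            spec["payment_codes"][t["payment"]],
        ])
    return buf.getvalue()

def upload(content, filename, host, user, password):
    """Stage 3: deliver the converted file over plain FTP (stdlib ftplib)."""
    from ftplib import FTP
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary(f"STOR {filename}", io.BytesIO(content.encode()))
```

There is no shared framework layer on purpose: a new mall means copying this shape, editing the spec and `convert`, and nothing else.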
The whole thing runs as a scheduled service on a small server, triggering automatically each night.
Reliability Without Over-Engineering
We made a deliberate choice to keep the individual scripts simple rather than building in extensive error handling. Instead, we have a separate weekly checker that scans all processed files and flags any gaps — dates where a file should exist but doesn’t. If anything is missing, it re-runs the upload automatically.
This “check and retry” approach has proven more robust in practice than trying to anticipate every failure mode upfront. Simple scripts are easy to debug and easy to update when a mall operator changes their spec. The audit loop provides the safety net.
What It Unlocked
Mall tenants are a critical segment for Fairmart’s growth. Compliance reporting isn’t a nice-to-have — for many customers, it’s a prerequisite for signing. Having a repeatable, low-overhead process for building these integrations changed our answer from “we’ll get to it eventually” to “we can have this ready for onboarding.”
The first integration took the most work. Each one since has been faster. That’s the shape of the thing we were actually building — not a single script, but a pattern we can run again.
Stack: Python, Node.js, Elasticsearch, SFTP/FTP
[^1]: This is common globally to varying degrees.