
SattaCell – Prediction Markets with LMSR
A centralized prediction market platform inspired by Polymarket, where users can trade shares on various outcomes using a fake token system. The platform uses the Logarithmic Market Scoring Rule (LMSR) as its Automated Market Maker (AMM) to automatically price shares and calculate probabilities based on market activity.
Context
The main goal of this project was to understand how Polymarket works, specifically the deep mathematical logic behind prediction markets. We chose to build it in Web2 rather than Web3 because we wanted more people to use it for fun and to understand how prediction markets work through hands-on simulation. Web3 would have limited our user base with barriers like wallet setup, gas fees, and network complexity.
This project was created for CodeCell Hack 2025 as a learning experience and demonstration of prediction market mechanics.
What is a Prediction Market?
A prediction market is a platform where users can buy and sell shares in the outcome of future events. The price of shares represents the market's collective probability estimate of that outcome occurring. For example, if shares for "Team A wins" are trading at 70%, the market believes there's a 70% chance Team A will win.
The Mathematics Behind LMSR
The Logarithmic Market Scoring Rule (LMSR) is a mathematical mechanism for pricing shares in prediction markets. It was developed by Robin Hanson and ensures liquidity and fair pricing.
Weight Calculation
For each outcome i, the weight is calculated as:
weight_i = exp(q_i / b)
Where:
- q_i = Quantity of shares outstanding for outcome i (the i-th entry of the market state vector q)
- b = Liquidity parameter (controls price sensitivity, default: 100)
- exp() = Exponential function
This weight determines how "heavy" each outcome is in the market based on current share quantities.
Probability Calculation
The probability of outcome i is:
P_i = weight_i / Σ(weight_j)
This normalizes weights so probabilities sum to 1, giving us the market's collective probability estimate for each outcome.
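To make the formulas concrete, here is a minimal sketch of the weight and probability calculation in TypeScript (function names and structure are illustrative, not the platform's actual code):

```typescript
// Minimal LMSR weight/probability sketch (illustrative, not the platform's actual code).
// q: outstanding shares per outcome, b: liquidity parameter (default 100).
function lmsrProbabilities(q: number[], b: number = 100): number[] {
  const weights = q.map((qi) => Math.exp(qi / b));      // weight_i = exp(q_i / b)
  const total = weights.reduce((sum, w) => sum + w, 0); // Σ weight_j
  return weights.map((w) => w / total);                 // P_i = weight_i / Σ weight_j
}

// A fresh two-outcome market is 50/50; 100 shares of outcome 0 push it to ~73%/27%.
console.log(lmsrProbabilities([0, 0]));   // [0.5, 0.5]
console.log(lmsrProbabilities([100, 0])); // ≈ [0.731, 0.269]
```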
Cost Function
The total cost of the current market state is:
C(q) = b × ln(Σ(exp(q_i / b)))
Where:
- C(q) = Total cost function
- b = Liquidity parameter
- ln() = Natural logarithm
- q = Vector of current share quantities for all outcomes
This creates a smooth pricing curve that ensures liquidity at all times.
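A sketch of the cost function follows, with the usual log-sum-exp guard so large share counts don't overflow (again illustrative, not the project's exact implementation):

```typescript
// LMSR cost function sketch: C(q) = b * ln(Σ exp(q_i / b)).
// The maximum is factored out (log-sum-exp trick) to keep the exponentials finite.
function lmsrCost(q: number[], b: number = 100): number {
  const m = Math.max(...q) / b;
  const sumExp = q.reduce((sum, qi) => sum + Math.exp(qi / b - m), 0);
  return b * (m + Math.log(sumExp));
}

console.log(lmsrCost([0, 0]));   // ≈ 69.31  (100 × ln 2)
console.log(lmsrCost([100, 0])); // ≈ 131.33 (100 × ln(1 + e))
```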
Trade Cost
When a user buys or sells shares, the cost is:
cost = C(q_after) - C(q_before)
Where:
- q_before = Market state before the trade
- q_after = Market state after the trade
- Positive cost = User pays tokens (buying shares)
- Negative cost = User receives tokens (selling shares)
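Putting the pieces together, a hypothetical tradeCost helper (reusing lmsrCost from the sketch above) shows how the charge for a trade falls out of the cost difference:

```typescript
// Trade cost sketch: cost = C(q_after) - C(q_before), reusing lmsrCost from above.
// A positive result means the buyer pays; a negative result means the seller is credited.
function tradeCost(q: number[], outcome: number, deltaShares: number, b: number = 100): number {
  const qAfter = [...q];
  qAfter[outcome] += deltaShares; // a negative deltaShares models selling
  return lmsrCost(qAfter, b) - lmsrCost(q, b);
}

console.log(tradeCost([0, 0], 0, 100)); // ≈ 62.01 tokens to buy 100 "Yes" shares
```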
Why LMSR/AMM Instead of Orderbook?
Prediction markets typically use one of two pricing mechanisms: orderbook matching (as on Kalshi) or an Automated Market Maker (AMM) such as LMSR (as on Polymarket). We chose LMSR for several reasons:
Orderbook Method (Kalshi):
- Requires buyers and sellers to match orders
- Can suffer from low liquidity (no trades if no counterparty)
- More complex to implement (order matching, partial fills, etc.)
- Prices only update when orders are matched
LMSR/AMM Method (Our Choice):
- Always Liquid: There's always a price available, even if you're the only trader
- Automatic Pricing: Prices calculated mathematically, no need to match orders
- Simpler Implementation: Single formula determines all prices
- Continuous Price Discovery: Probabilities update instantly with every trade
- Better for Low-Liquidity Markets: Works even with few traders
For a hackathon with limited time and users, LMSR was the perfect choice. It ensured our platform was always tradeable, even during low-traffic periods.
Simulated Example
Let's walk through a concrete example:
Initial State:
- Market: "Will it rain tomorrow?" with outcomes ["Yes", "No"]
- Initial state: q = [0, 0] (no shares outstanding)
- Liquidity parameter: b = 100
- Initial probabilities: 50% / 50% (equal)
Trade 1: Buying 100 shares of "Yes"
Before Trade:
- C([0, 0]) = 100 × ln(exp(0/100) + exp(0/100)) = 100 × ln(2) = 69.31 tokens
After Trade:
- q = [100, 0] (added 100 shares to "Yes")
- C([100, 0]) = 100 × ln(exp(100/100) + exp(0/100)) = 100 × ln(3.718) ≈ 131.33 tokens
Cost: 131.33 - 69.31 ≈ 62.01 tokens
New Probabilities:
- weight_yes = exp(100/100) = 2.718
- weight_no = exp(0/100) = 1.0
- P_yes = 2.718 / 3.718 = 73.1%
- P_no = 1.0 / 3.718 = 26.9%
Trade 2: Buying 100 shares of "No" (after Trade 1)
Before Trade:
- C([100, 0]) ≈ 131.33 tokens
After Trade:
- q = [100, 100]
- C([100, 100]) = 100 × ln(exp(1) + exp(1)) = 100 × ln(5.437) ≈ 169.31 tokens
Cost: 169.31 - 131.33 ≈ 37.99 tokens
New Probabilities:
- Both outcomes now have equal shares, so probabilities return to 50% / 50%
Key Insights:
- The first 100 "Yes" shares cost 62.19 tokens (higher price due to moving from 50% to 73.1%)
- The first 100 "No" shares cost only 37.8 tokens (cheaper because "No" was at 26.9%)
- This demonstrates price impact or slippage - larger orders move prices more
- When both outcomes have equal shares, probabilities return to equilibrium (50/50)
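Reusing the sketches above, the whole walkthrough can be replayed in a few lines to check the numbers:

```typescript
// Replaying the walkthrough with the tradeCost/lmsrProbabilities sketches above (b = 100).
const cost1 = tradeCost([0, 0], 0, 100);   // ≈ 62.01: 100 "Yes" shares from a 50/50 market
const cost2 = tradeCost([100, 0], 1, 100); // ≈ 37.99: 100 "No" shares while "No" sits at 26.9%
console.log(cost1, cost2);
console.log(lmsrProbabilities([100, 100])); // back to [0.5, 0.5]
```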
Implementation Details
The platform calculates actual LMSR costs in real time rather than rough estimates. The trade flow is atomic and safe:
- Fetch current market state (q vector)
- Calculate LMSR cost BEFORE trade
- Apply delta to selected outcome
- Calculate LMSR cost AFTER trade
- Deduct/add tokens from user balance
- Update outcome shares in market state
- Record trade in database
- Return updated probabilities
All steps are wrapped in a MongoDB transaction to ensure atomicity.
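The following is a condensed sketch of that flow using the MongoDB Node.js driver. Collection names, document shapes, and the lmsrCost/lmsrProbabilities helpers from the earlier sketches are assumptions for illustration, not the platform's actual schema or code:

```typescript
import { MongoClient } from "mongodb";

// Assumed document shapes for illustration only.
interface MarketDoc { _id: string; q: number[] }
interface UserDoc { _id: string; balance: number }

// Condensed sketch of the trade flow described above, wrapped in a transaction.
async function executeTrade(
  client: MongoClient,
  marketId: string,
  userId: string,
  outcome: number,
  deltaShares: number,
  b = 100
): Promise<number[]> {
  const db = client.db("sattacell");
  const session = client.startSession();
  let probabilities: number[] = [];
  try {
    await session.withTransaction(async () => {
      // 1. Fetch current market state (q vector).
      const market = await db.collection<MarketDoc>("markets").findOne({ _id: marketId }, { session });
      if (!market) throw new Error("market not found");

      // 2-4. Cost before, apply delta to the chosen outcome, cost after.
      const qAfter = [...market.q];
      qAfter[outcome] += deltaShares;
      const cost = lmsrCost(qAfter, b) - lmsrCost(market.q, b);

      // 5. Deduct (or credit, for sells) tokens; the filter rejects overdrafts.
      const debit = await db.collection<UserDoc>("users").updateOne(
        { _id: userId, balance: { $gte: cost } },
        { $inc: { balance: -cost } },
        { session }
      );
      if (debit.matchedCount === 0) throw new Error("insufficient balance");

      // 6. Update outcome shares in the market state.
      await db.collection<MarketDoc>("markets").updateOne(
        { _id: marketId },
        { $set: { q: qAfter } },
        { session }
      );

      // 7. Record the trade.
      await db.collection("trades").insertOne(
        { marketId, userId, outcome, deltaShares, cost, at: new Date() },
        { session }
      );

      // 8. Updated probabilities go back to the caller.
      probabilities = lmsrProbabilities(qAfter, b);
    });
  } finally {
    await session.endSession();
  }
  return probabilities;
}
```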
Additional Features
We also built a quadratic voting system for hackathon participants, allowing them to vote on market outcomes using a fair, democratic mechanism in which the cost of votes grows quadratically, so voting power scales with the square root of tokens spent.
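Assuming the standard quadratic-voting rule (n votes cost n² tokens), the mechanism boils down to a couple of lines:

```typescript
// Quadratic voting sketch: casting n votes on an outcome costs n^2 tokens,
// so effective voting power grows only with the square root of tokens spent.
function quadraticVoteCost(votes: number): number {
  return votes * votes;
}

function maxVotesFor(tokens: number): number {
  return Math.floor(Math.sqrt(tokens));
}

console.log(quadraticVoteCost(5)); // 25 tokens buy 5 votes
console.log(maxVotesFor(100));     // 100 tokens buy at most 10 votes
```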
For more detailed information about the implementation, tech stack, API documentation, and system architecture, check the GitHub README.
Lessons & Takeaways
- Understanding Complex Systems: Building LMSR from scratch taught us the mathematical foundations of prediction markets
- Web2 for Accessibility: Choosing Web2 allowed us to focus on the core mechanics without blockchain complexity
- Real-time Analytics: Building 10+ interactive charts helped visualize market dynamics
- Atomic Transactions: Implementing MongoDB transactions ensured data consistency
- User Experience: Balancing mathematical accuracy with intuitive UI was key to adoption
- Scaling Under Load: With 3k+ trades in just 5 days, we had to implement MongoDB change streams to handle real-time data updates efficiently
- Real-time Infrastructure: Implemented Socket.io for live updates across the platform, so chart movements, leaderboard updates, trade history, and probability changes all refresh in real time without page reloads (a wiring sketch follows this list)
- Performance Optimization: The high trade volume forced us to optimize database queries and implement efficient change stream listeners to keep the UI responsive
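As an illustration of the change-stream-to-Socket.io wiring mentioned above (the event name, port, and collection names are assumptions, not the project's actual configuration):

```typescript
import { MongoClient } from "mongodb";
import { Server } from "socket.io";

// Illustrative wiring: broadcast every newly inserted trade to connected clients.
// Note that MongoDB change streams require a replica set (or Atlas) deployment.
async function broadcastTrades(mongoUrl: string): Promise<void> {
  const client = await MongoClient.connect(mongoUrl);
  const io = new Server(3001, { cors: { origin: "*" } });

  // Change stream filtered to inserts on the trades collection.
  const stream = client
    .db("sattacell")
    .collection("trades")
    .watch([{ $match: { operationType: "insert" } }]);

  stream.on("change", (change) => {
    if (change.operationType === "insert") {
      io.emit("trade:new", change.fullDocument); // clients update charts/leaderboards live
    }
  });
}
```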
"Predicting the future, one trade at a time."
Made with ❤️ for CodeCell Hack 2025
Team
Built with amazing collaborators:
- Amandeep Singh Rathod – Co-developer
- Amrit Nigam – Co-developer
- Aditya Belgaonkar – Co-developer
- Omik Acharya – Co-developer