Exploits as a Feature? What zkSync and KiloEx Can Teach Us About Protocol Risk

Introduction
Another day, another exploit. That’s how it feels in crypto sometimes. But some incidents deserve more than a shrug: they expose design flaws that could hurt entire ecosystems if ignored. That’s what happened with KiloEx and several projects on zkSync. These weren’t just smart contract bugs. They were warning signs about how fast-moving chains can sacrifice safety for speed, and about what happens when developers chase TVL without battle-testing their stack. For contributors in the Mitosis ecosystem, they’re a wake-up call. Modular ecosystems like ours offer speed, flexibility, and permissionless innovation, but that also means security becomes a shared responsibility.
In this piece:
➙ We’ll break down what went wrong on zkSync and KiloEx
➙ Zoom out to the ecosystem-level problems
➙ Walk through what Mitosis builders can do to make security part of the culture, not just a checkbox
Let’s get into it.
What Really Happened on zkSync and KiloEx?
In early 2024, KiloEx, a DEX built on zkSync Era, got exploited. The attacker manipulated price feeds, created fake arbitrage opportunities, and drained ~$80K. That’s not massive in dollar terms, but it was part of a pattern. Other zkSync projects, like forks of Merlin and SyncSwap, were also getting hit:
➙ Unverified contracts
➙ Sloppy upgradability
➙ Oracle issues
➙ Inconsistent audit standards
The common thread? A rush to ship without solid infrastructure.
Example: The KiloEx Oracle Mistake
Instead of using a reliable oracle like Chainlink, KiloEx built its own. That gave attackers room to:
→ Manipulate pool balances
→ Skew prices
→ Drain the protocol without raising alarms
The exploit didn’t need sophisticated tech, just poor assumptions.
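To make that failure mode concrete, here’s a minimal TypeScript sketch. The names and numbers are entirely hypothetical (this is not KiloEx’s actual code); it contrasts a naive spot-price read, which a single large swap or flash loan can skew, with a guarded read that cross-checks an independent reference feed and fails closed on large deviations.

```typescript
// Hypothetical illustration only; not KiloEx's actual implementation.
// A naive oracle derives price purely from current pool balances,
// so a large swap (or flash loan) can move it arbitrarily.
interface Pool {
  baseReserve: number;   // e.g. ETH in the pool
  quoteReserve: number;  // e.g. USDC in the pool
}

function naiveSpotPrice(pool: Pool): number {
  // Whatever the reserves say right now is treated as "the price".
  return pool.quoteReserve / pool.baseReserve;
}

// A guarded read cross-checks the pool price against an independent
// reference feed and refuses to use it if the deviation is too large.
function guardedPrice(
  pool: Pool,
  referencePrice: number,   // assumed to come from a robust external feed
  maxDeviation = 0.02       // 2% tolerance, an illustrative default
): number {
  const spot = naiveSpotPrice(pool);
  const deviation = Math.abs(spot - referencePrice) / referencePrice;
  if (deviation > maxDeviation) {
    // Fail closed: better to pause than to price positions off a skewed pool.
    throw new Error(`Price deviation ${(deviation * 100).toFixed(2)}% exceeds limit`);
  }
  return spot;
}

// An attacker dumping into the pool skews the naive price but trips the guard.
const pool: Pool = { baseReserve: 100, quoteReserve: 300_000 }; // spot = 3000
console.log(naiveSpotPrice(pool));   // 3000
pool.baseReserve = 130;              // attacker sells base into the pool
console.log(naiveSpotPrice(pool));   // ~2308, silently accepted by the naive read
try {
  console.log(guardedPrice(pool, 3000));
} catch (err) {
  console.log("guard tripped:", (err as Error).message);
}
```

Note that the guard fails closed: pausing is cheaper than pricing positions off a skewed pool.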
Lesson: If your ecosystem isn’t providing secure defaults, your developers are one bad decision away from disaster.
This Isn’t Just About Code, It’s About Ecosystem Culture
Let’s be real: fast-growing chains attract copy-paste builders. zkSync was booming. TVL was pumping. Everyone wanted to be first. But that kind of gold rush creates a dangerous culture:
→ Fork and deploy
→ Cut corners
→ Pray nothing breaks
And it’s not just zkSync. It could happen to any modular chain, including Mitosis, if we’re not careful.
a. L2 ≠ Automatic Security
One of the biggest misconceptions: “We’re on an L2, so we inherit Ethereum’s security.” That’s not how it works. Sure, zkSync settles to Ethereum. But your DEX smart contract? That’s still your problem. If your code has a bug, Ethereum won’t save you.
Mitosis takeaway:
> Security isn’t modular by default. It has to be designed into every layer, from SDKs to rollup logic to contract templates.
b. When the Culture Prioritizes Speed, Mistakes Multiply
zkSync’s ecosystem was young. Tooling was fragmented. Audit practices were inconsistent. And teams were under pressure to ship fast. There’s a lesson here: you can’t rely on every contributor to be a security expert. But you can give them tools and incentives to act like one.
How Mitosis Can Build a Security-First Ecosystem
Modular chains give devs a blank canvas. That’s the magic, and the risk. So the real question for Mitosis is:
> How do we let contributors move fast without breaking everything?
Here are three practical approaches:
a. Secure Dev Tooling — Not Optional
Mitosis needs to ship secure-by-default templates. Think:
➢ Pre-audited rollup modules
➢ Standardized oracle and upgrade patterns
➢ Built-in test harnesses and fuzzers
If you don’t give devs a safe foundation, you’re asking for exploits.
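As one illustration of what “built-in test harnesses and fuzzers” could mean in practice, here’s a short property-based test sketch using the fast-check library for TypeScript. The constant-product pool model and the invariant are assumptions for the example, not part of any existing Mitosis SDK; the idea is that templates could ship with this kind of scaffold so invariant fuzzing becomes the path of least resistance.

```typescript
// Sketch of a template-provided invariant test; the pool model is hypothetical.
import fc from "fast-check";

// Constant-product AMM model: x * y must never decrease across a swap
// (fees can only increase it). The fuzzer hammers this with random inputs.
function swap(x: number, y: number, amountIn: number, feeBps = 30) {
  const amountInAfterFee = amountIn * (1 - feeBps / 10_000);
  const newX = x + amountIn;
  const newY = (x * y) / (x + amountInAfterFee);
  return { newX, newY };
}

fc.assert(
  fc.property(
    fc.integer({ min: 1_000, max: 1_000_000 }), // reserve x
    fc.integer({ min: 1_000, max: 1_000_000 }), // reserve y
    fc.integer({ min: 1, max: 100_000 }),       // trade size
    (x, y, amountIn) => {
      const before = x * y;
      const { newX, newY } = swap(x, y, amountIn);
      // Invariant: the product never drops below its pre-swap value.
      return newX * newY >= before;
    }
  )
);
console.log("invariant held for all generated cases");
```

Property tests like this don’t replace audits, but they catch whole classes of arithmetic and rounding bugs before a contract ever reaches one.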
Example: Optimism’s Bedrock Stack
Optimism rebuilt its entire codebase (Bedrock) to simplify dev work and harden its infrastructure. It reduced attack surface, made audits easier, and gave builders a reliable foundation.
Mitosis could take this further — by baking security tooling directly into the SDK.
b. Make Security a Shared Game
Security culture starts with incentives. What if every new rollup deployed on Mitosis had to:
➢ Run a bug bounty (even a small one)?
➢ Open-source their code before mainnet?
➢ Get a peer audit from another contributor?
It’s not about being perfect. It’s about being accountable.
Example: Immunefi + Initia
Initia launched its pre-TGE bug bounty with Immunefi, giving whitehats a shot at breaking things before real money entered the system. This didn’t just surface bugs; it built trust in the project.
c. Real-Time Monitoring — Because Delays Are Fatal
Most hacks aren’t stopped in real time. But they could be detected faster with better tooling. Mitosis could offer:
➢ On-chain monitors for TVL changes
➢ Price feed deviation alerts
➢ Real-time dashboards for multisig signers
This isn’t “nice to have”; it’s the difference between a contained issue and a multimillion-dollar exploit.
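As a rough sketch of how lightweight such a monitor can be, here’s a TVL watchdog in TypeScript: a polling loop that compares successive readings and raises an alert when a drawdown exceeds a bound. The threshold, interval, data source, and alert hook are all placeholder assumptions, not an existing Mitosis service.

```typescript
// Minimal monitoring loop sketch; readTvl() and sendAlert() are placeholders
// for whatever data source and paging system an ecosystem actually wires up.
type Reader = () => Promise<number>;
type Alerter = (message: string) => Promise<void>;

async function monitorTvl(
  readTvl: Reader,
  sendAlert: Alerter,
  maxDropPerInterval = 0.10, // alert if TVL falls more than 10% between polls (illustrative)
  intervalMs = 30_000
): Promise<void> {
  let previous = await readTvl();
  while (true) {
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
    const current = await readTvl();
    if (previous > 0) {
      const drop = (previous - current) / previous;
      if (drop > maxDropPerInterval) {
        // A sudden drawdown is the classic on-chain signature of an exploit in progress.
        await sendAlert(`TVL dropped ${(drop * 100).toFixed(1)}% in ${intervalMs / 1000}s`);
      }
    }
    previous = current;
  }
}

// Example wiring with stub implementations:
monitorTvl(
  async () => 1_000_000,                     // replace with an on-chain TVL query
  async (msg) => console.log("ALERT:", msg)  // replace with a pager or webhook call
);
```

Hooking the same pattern up to price feeds or multisig activity is a small step from here.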
Conclusion: Security Is a Culture, Not a Checklist
The KiloEx and zkSync exploits didn’t just show us what went wrong — they showed us how quickly it can go wrong when speed beats discipline. In modular systems, every new contributor is a potential point of failure. But they’re also a potential line of defense, if the ecosystem gives them the tools, templates, and mindset to build responsibly. So here’s the challenge for Mitosis:
➢ Create an ecosystem where security is assumed, not optional
➢ Incentivize best practices, not just fast launches
➢ Build a contributor culture where prevention is the flex, not patching post-hack
Because the real alpha? It’s building something that doesn’t break when the next bull market stampede begins.