<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[0xoptimusprime]]></title><description><![CDATA[0xoptimusprime]]></description><link>https://www.0xoptimusprime.com</link><image><url>https://cdn.hashnode.com/res/hashnode/image/upload/v1765993850480/38623496-4f90-4480-bc61-520e7a24b5b9.png</url><title>0xoptimusprime</title><link>https://www.0xoptimusprime.com</link></image><generator>RSS for Node</generator><lastBuildDate>Sun, 17 May 2026 11:59:16 GMT</lastBuildDate><atom:link href="https://www.0xoptimusprime.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Analyst’s Guide: Querying Morpho BlueVaults on Idle Liquidity]]></title><description><![CDATA[Over the past three weeks, I’ve been working on an analysis of the Level Finance protocol, like the image you see above.
And it’s dawned on me that querying lending and borrowing markets on Morpho via Dune can be daunting, a little frustrating, and a...]]></description><link>https://www.0xoptimusprime.com/analysts-guide-querying-morpho-bluevaults-on-idle-liquidity</link><guid isPermaLink="true">https://www.0xoptimusprime.com/analysts-guide-querying-morpho-bluevaults-on-idle-liquidity</guid><category><![CDATA[Blockchain]]></category><category><![CDATA[Dune Analytics]]></category><category><![CDATA[blockchain data]]></category><dc:creator><![CDATA[Olusegun Aborode]]></dc:creator><pubDate>Thu, 09 Oct 2025 23:00:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1765995431784/c90ebcca-6d6b-42b0-bf17-4e295ab2e817.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Over the past three weeks, I’ve been working on an analysis of the Level Finance protocol, like the image you see above.</p>
<p>And it’s dawned on me that querying lending and borrowing markets on Morpho via Dune can be daunting, a little frustrating, and also quite exciting.</p>
<p>The analysis of the LevelUSD protocol looked at the utilisation of lvlUSD and slvlUSD, alongside other insights across DeFi use cases like Pendle, Morpho, Curve, and Spectra.</p>
<blockquote>
<p><em>You can find the complete dashboard here:</em> <a target="_blank" href="https://dune.com/0nchainlabs/level-lvlusd-dashboard"><em>https://dune.com/0nchainlabs/level-lvlusd-dashboard</em></a></p>
</blockquote>
<p>While working on the utilisation of lvlUSD and slvlUSD on Morpho, my first scope was to analyse their associated vaults, the <strong>M11C Level lvlUSD vault and the Steakhouse Level USDC vault</strong>, and then query the Total Value Locked (TVL), or total deposits, in both vaults.</p>
<p>Morpho, for those unfamiliar, is a permissionless lending protocol on Ethereum that allows curators to create optimized vaults for lending assets like lvlUSD (a synthetic stablecoin from Level Finance) and USDC.</p>
<p>So I began with a common approach, <strong>calculating net flows from deposit and withdrawal events,</strong> but ran into issues when the figures didn’t match the vault’s displayed stats in the Morpho app for the Steakhouse vault.</p>
<p>It was strange: the same query worked great, with near-matched results, on the M11C vault; the Steakhouse vault, however, was showing something I’d never seen before, which is <strong>Idle Liquidity</strong>.</p>
<p>Since these were new waters for me, I started reading up, and Andrew Wong’s resources were invaluable for understanding what this means, as well as which logs to decode and query to achieve my desired result.</p>
<p>And in this article, I’ll walk you through both methods, how I applied them, and the exact SQL queries I ran on Dune Analytics, along with the data outputs from my runs at the time.</p>
<h3 id="heading-lets-understand-what-idle-liquidity-means-from-a-data-pov"><strong>Let’s Understand What Idle Liquidity Means from a Data POV</strong></h3>
<p><img src="https://miro.medium.com/v2/resize:fit:1400/1*0Mu7zfAYnT-tYaWx41adPw.png" alt /></p>
<p>In the context of Morpho vaults (like the USDC vault shown in the screenshot, here the Steakhouse vault on Morpho Blue), “Idle Liquidity” refers to the portion of the vault’s total deposited assets that is not currently allocated or lent out to any active markets or external protocols.</p>
<p>Instead, these assets are held directly in the main vault contract as a readily available buffer. This ensures quick access for user withdrawals and provides flexibility for the vault’s curator or allocator to deploy capital strategically without immediate risk exposure. Why is this important?</p>
<ul>
<li><p><strong>Separation of Assets</strong>: Morpho Vaults v2 explicitly distinguishes between idle liquidity (unallocated capital) and allocated capital (assets supplied to yield-generating markets via adapters).<br />  All user deposits start as idle liquidity in the vault contract. The vault’s allocator can then move portions of it to specific markets (e.g., the PT-lvlUSD-25SEP2025/USDC in the screenshot, which has a 3.07% allocation).</p>
</li>
<li><p><strong>Conservative risk management</strong>: The vault’s curator sets caps (absolute or relative) on allocations to limit exposure to certain collaterals or markets, leaving more assets idle to avoid volatility or potential bad debt.</p>
</li>
<li><p><strong>Liquidity prioritization</strong>: To handle potential withdrawals without needing to pull funds from illiquid positions.</p>
</li>
<li><p><strong>Market conditions</strong>: If yields in available markets are low or risky, the allocator might hold back deployment.</p>
</li>
</ul>
<p>In USDC markets specifically, USDC is the loan token (the asset being lent). Idle liquidity in these markets represents USDC deposits not yet supplied to borrowers against collaterals like PT-lvlUSD, lvlUSD, or slvlUSD.</p>
<p>This means it’s essentially “parked” USDC, earning no yield, in contrast with the allocated portions that generate interest from borrowers.</p>
<p>No doubt this is normal in DeFi lending and borrowing. But from a data POV, it completely changes how you approach analysing this vault, and with it the whole outlook of your results. Here’s what I mean.</p>
<h2 id="heading-the-setup-my-querying-experience"><strong>The Setup: My Querying Experience</strong></h2>
<p>I was analyzing the <a target="_blank" href="https://app.morpho.org/ethereum/vault/0x2C3Cc1C02856894345797Cf6ee76aE76AC0f4031/m11c-level-lvlusd">M11C Level lvlUSD vault</a> and the <a target="_blank" href="https://app.morpho.org/ethereum/vault/0xbEEf11C63d7173BdCC2037e7220eE9Bd0cCDA862/steakhouse-level-usdc">Steakhouse Level USDC vault</a>. At the time of writing, the Morpho app showed TVL values of about $395.50k for M11C and $20.43k for Steakhouse.</p>
<p>These are ERC-4626-compatible vaults, meaning they emit standard events for deposits, withdrawals, and state updates, making them queryable via Ethereum logs.</p>
<p><strong>I started with Method 1</strong>: Summing all historical deposits minus withdrawals to reconstruct the TVL. This seemed logical, like tallying a bank ledger.</p>
<p>I wrote a SQL query to pull Deposit and Withdraw events, extract the asset amounts (scaled by 1e6 for USDC/lvlUSD decimals), and compute the net.</p>
<p>Here’s the exact query I ran for the Steakhouse vault:</p>
<p><img src="https://miro.medium.com/v2/resize:fit:1400/1*7kGMKKF_AFyth97pBZR3OQ.png" alt /></p>
<p>But wait, these results didn’t match the Morpho app’s $20.43k!</p>
<p><img src="https://miro.medium.com/v2/resize:fit:1400/1*ruZYPW94zvdY3H-KsvmHOg.png" alt /></p>
<p>Snapshot from a Morpho Steakhouse Vault</p>
<p>Was the query wrong? No. What about the logic? Also no. So why? Why was this result negative?</p>
<p>Well, vaults like these accrue interest, take fees, and rebalance assets internally (e.g., lending out to Morpho markets), none of which is captured in simple deposit/withdraw events.</p>
<p>Also, remember the Idle Liquidity? Yeah, at this point, that liquidity is in the vault, but there have been no deposit/withdrawal events to account for it, because it is idle.</p>
<p>Hence, the query and logic of Method 1 alone would not work. Thanks to some resources (actually, Andrew Wong's resources), I noticed recurring UpdateLastTotalAssets events in the logs on these vaults’ transaction receipts.</p>
<p>It turns out these events are emitted whenever the vault’s total assets change, not just from user actions but also from internal updates like interest accrual or reallocations.</p>
<p>This event directly logs the absolute current total, like a bank’s official balance statement. Switching to this (<a target="_blank" href="https://dune.com/queries/5624347/9142798">Method Two</a>) fixed everything.</p>
<p><img src="https://miro.medium.com/v2/resize:fit:1400/1*-JwWrGA32zyRIG-7_t1mIA.png" alt /></p>
<p>All I did was drop the deposit/withdrawal events, decode the UpdateLastTotalAssets events instead (as shown below), and convert the data column with varbinary_to_uint256 to get total_deposits. Doing this is simply querying the state snapshot.</p>
<p><img src="https://miro.medium.com/v2/resize:fit:1400/1*zOfyF64cxXdVNcmD_hsxfw.png" alt class="image--center mx-auto" /></p>
<h3 id="heading-the-two-methods-net-flows-vs-state-snapshot"><strong>The Two Methods: Net Flows vs. State Snapshot</strong></h3>
<p>There are two primary ways to query TVL in a Morpho vault using Ethereum logs, as I have shown above:</p>
<ol>
<li><strong>Method 1: Net Flows (Sum Deposits Minus Withdrawals)</strong></li>
</ol>
<ul>
<li><p><strong>How to Do It</strong>: Aggregate all Deposit and Withdraw events from the vault’s inception. Extract the assets field from the data blob (first 32 bytes), scale by decimals (1e6 for USDC/lvlUSD in this case), and compute the net.</p>
</li>
<li><p><strong>Differences from Method 2</strong>: This reconstructs TVL historically, treating the vault like a flow-based ledger. It ignores internal adjustments (e.g., interest from lending in Morpho markets or curator fees).</p>
</li>
</ul>
<ol start="2">
<li><strong>Method 2: State Snapshot (Latest UpdateLastTotalAssets Event)</strong></li>
</ol>
<ul>
<li><p><strong>How to Do It</strong>: Query the most recent UpdateLastTotalAssets event. The data field is the raw totalAssets uint256, divided by 1e6 for a readable value.</p>
</li>
<li><p><strong>Differences from Method 1</strong>: This uses the vault’s internal state update, which is emitted on every change (user actions, accruals, rebalances). It’s an absolute value, not a reconstruction.</p>
</li>
</ul>
<h3 id="heading-which-is-better-pros-and-cons"><strong>Which is Better? Pros and Cons</strong></h3>
<p>In my experience, at least from this one, which cost me literally a whole day: <strong>Method 2 (State Snapshot) is far superior for current TVL queries</strong>.</p>
<p>It’s what I will now use exclusively for dashboards and monitoring. Here’s why:</p>
<p><strong>Pros of Method 1 (Net Flows)</strong>:</p>
<ul>
<li><p>Great for historical analysis or auditing flows over time (e.g., track user behavior or net inflows monthly).</p>
</li>
<li><p>Transparent: You see every transaction contributing to the total.</p>
</li>
<li><p>Useful for understanding patterns, like in my 30-day activity query, where I saw deposits of ~$52.8k outweighing withdrawals.</p>
</li>
<li><p>Also great for seeing granular, transaction-level data.</p>
</li>
</ul>
<p><strong>Cons of Method 1</strong>:</p>
<ul>
<li><p>Error-prone: Misses non-user changes like interest (via AccrueInterest events) or reallocations, leading to mismatches (e.g., my negative TVL issue).</p>
</li>
<li><p>Computationally heavy: Summing thousands of events can hit query limits on platforms like Dune.</p>
</li>
<li><p>Inaccurate for live monitoring: Vault balances “suddenly” change without new deposits/withdrawals, making the figures wrong quickly.</p>
</li>
</ul>
<p><strong>Pros of Method 2 (State Snapshot)</strong>:</p>
<ul>
<li><p>Reliable and accurate: It’s the vault’s own accounting — includes everything (deposits, withdrawals, interest, fees, rebalances).</p>
</li>
<li><p>Efficient: Just one event to query, fast for real-time use.</p>
</li>
<li><p>Matches official sources: Aligned perfectly with the Morpho app stats in my tests.</p>
</li>
</ul>
<p><strong>Cons of Method 2</strong>:</p>
<ul>
<li><p>Less granular: Doesn’t show historical breakdowns without querying multiple events.</p>
</li>
<li><p>Relies on event emission: If the vault stops updating (rare, but possible in bugs), it’s stale, although Morpho vaults are updated regularly.</p>
</li>
</ul>
<p>Overall, Method 2 is better for 90% of use cases, especially current TVL, where you also need to account for vaults with Idle Liquidity, which you could miss using Method 1.</p>
<p>Still, Method 1 shines for forensics or trends, but it requires supplementing with AccrueInterest events to account for yields, which is also very important.</p>
<p><a target="_blank" href="https://medium.com/tag/blockchain?source=post_page-----e40346e1865b---------------------------------------">  
</a><a target="_blank" href="https://medium.com/plans?source=upgrade_membership---post_li_non_moc_upsell--e40346e1865b---------------------------------------">  
</a></p>
]]></content:encoded></item><item><title><![CDATA[Level Protocol: A Comprehensive Data Analysis]]></title><description><![CDATA[@levelusd launched lvlUSD as a fully collateralized stablecoin backed by USDC and USDT, with reserves strategically deployed into established lending protocols like Aave and Morpho.
Level Protocol's fundamental premise was compelling: to create a sta...]]></description><link>https://www.0xoptimusprime.com/level-protocol-a-comprehensive-data-analysis</link><guid isPermaLink="true">https://www.0xoptimusprime.com/level-protocol-a-comprehensive-data-analysis</guid><category><![CDATA[Blockchain]]></category><category><![CDATA[Blockchain technology]]></category><category><![CDATA[Dune Analytics]]></category><dc:creator><![CDATA[Olusegun Aborode]]></dc:creator><pubDate>Wed, 08 Oct 2025 23:00:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766011261645/ec2804f0-dcce-4270-94dc-5e54ba1091a8.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><a target="_blank" href="https://x.com/@levelusd">@levelusd</a> <a target="_blank" href="https://x.com/@levelusd">launch</a>ed <a target="_blank" href="https://x.com/@levelusd">lvlUSD a</a>s a fully collateralized stablecoin backed by USDC and USDT, with reserves strategically deployed into established lending protocols like Aave and Morpho.</p>
<p>Level Protocol's fundamental premise was compelling: to create a stablecoin that generates a sustainable yield from real economic activity, rather than relying on token emissions. Users could stake lvlUSD into slvlUSD to capture returns from borrowing demand across integrated lending markets.</p>
<p>The protocol's design centers on delivering sustainable, low-risk yield from blue-chip DeFi lending protocols. Users can hold lvlUSD as a stable asset or stake it into slvlUSD to earn automated yield while maintaining composability across DeFi applications. At the same time, $2.4 million in cumulative yield has been distributed.</p>
<p>As of October 2025, Level Protocol had a total lvlUSD supply of $3.37 million with slvlUSD at $1.95 million, representing a combined $5.32 million in circulating tokens. The staking ratio sits at 0.5444, indicating that roughly 54% of lvlUSD supply has been staked into the yield-bearing slvlUSD variant.</p>
<p><img src="https://pbs.twimg.com/media/G2xDAykXkAAwK7X?format=png&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>The weekly supply chart traces the protocol's trajectory from its October 2024 launch through October 2025. Starting from near-zero, the supply peaked in May 2025 at approximately $184 million before declining to current levels.</p>
<p>This analysis examines on-chain data across Level Protocol's primary liquidity venues, Pendle, Curve, Morpho, and Spectra, tracking adoption patterns, liquidity distribution, and usage trends from launch through October 2025.</p>
<blockquote>
<p><strong>Access the Dashboard here:</strong> <a target="_blank" href="https://dune.com/0nchainlabs/level-lvlusd-dashboard"><strong>https://dune.com/0nchainlabs/level-lvlusd-dashboard</strong></a></p>
</blockquote>
<h3 id="heading-lvlusd-amp-slvlusd-utilisation-and-distribution"><strong>lvlUSD &amp; slvlUSD Utilisation and Distribution</strong></h3>
<p>Examining the distribution across all venues reveals how Level Protocol's liquidity was allocated throughout the DeFi ecosystem.</p>
<p>At peak capacity in May-July 2025, Pendle held approximately $30 million, Curve held roughly $16 million across both pools, Morpho reached $10 million, and Spectra achieved $20,000. This accounts for approximately $46 million across the four tracked venues.</p>
<p>With a peak total supply of $184 million, the tracked venues represent roughly 26% of total protocol supply. The remaining 74% was probably distributed across direct holdings, other DeFi integrations, wallets, or applications not captured in this analysis.</p>
<h3 id="heading-pendle-principal-yield-tokenization"><strong>Pendle: Principal Yield Tokenization</strong></h3>
<p><img src="https://pbs.twimg.com/media/G2xDIviXYAASorw?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>Pendle tokenizes yield-bearing assets, allowing users to split assets into ownership and yield components for separate trading. Level Protocol achieved significant traction on this platform, making it one of the protocol's largest liquidity venues.</p>
<p>The platform showed near-zero activity in April 2025, growing rapidly to approximately $30 million in total value locked by early July 2025. This peak represented the protocol's strongest single-venue performance across the entire ecosystem.</p>
<blockquote>
<p><strong>Current TVL stands at $816,704 as of October 2025.</strong></p>
</blockquote>
<p>The composition of assets on Pendle evolved notably over time. During the growth phase, slvlUSD dominated liquidity pools, reaching roughly $29 million, while lvlUSD held around $8 million.</p>
<p>By September 2025, this distribution shifted toward approximate parity.</p>
<h3 id="heading-curve-stablecoin-exchange-dynamics"><strong>Curve: Stablecoin Exchange Dynamics</strong></h3>
<p><img src="https://pbs.twimg.com/media/G2xDUEuWAAA67Ai?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>Curve Finance operates as a decentralized exchange optimized for stablecoin swaps with minimal slippage. Level Protocol deployed two distinct pool types on Curve: lvlUSD/USDC and slvlUSD/lvlUSD, each serving different user preferences.</p>
<p>The lvlUSD/USDC pool currently holds $492,225 in TVL, while the slvlUSD/lvlUSD pool holds $185,538. This creates a 2.65x size ratio favoring the USDC-paired pool.</p>
<p>Both pools grew steadily from April 2025 through July 2025, with the monthly TVL chart showing a consistent upward trajectory during this period.</p>
<p>The larger size of the lvlUSD/USDC pool probably demonstrates user preference for trading against established stablecoins rather than swapping between lvlUSD variants.</p>
<h3 id="heading-morpho-lending-market-performance"><strong>Morpho: Lending Market Performance</strong></h3>
<p><img src="https://pbs.twimg.com/media/G2xDbJwXoAAC9hu?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>Level Protocol deployed two lending markets on Morpho: the M11C vault and the Steakhouse vault, each attracting different levels of user engagement.</p>
<p>The M11C vault currently holds $168,779 in TVL, representing about 99% of total Morpho lvlUSD activity. The Steakhouse vault holds $1,852, accounting for the remaining 1%. This roughly 90x difference in size indicates strong user preference for the M11C vault, likely due to earlier launch timing, greater visibility, or more attractive yield parameters, though really it could be any combination of factors.</p>
<ul>
<li><p>The M11C vault's activity timeline shows distinctive patterns. The vault experienced rapid growth to approximately $2 million in early June 2025. This was followed by significant volatility through July, with TVL declining to near-zero before partially recovering to current levels.</p>
</li>
<li><p>The Steakhouse vault, which launched after M11C, experienced rapid growth to approximately $14 million in early June, and has since flattened out to current levels.</p>
</li>
</ul>
<p>Morpho's lending markets captured meaningful protocol usage, with the Steakhouse vault achieving eight-figure TVL during its peak performance period.</p>
<h3 id="heading-spectra-enhanced-yield-pools"><strong>Spectra: Enhanced Yield Pools</strong></h3>
<p><img src="https://pbs.twimg.com/media/G2xDgMkW8AEQrs3?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>Level Protocol's presence on Spectra represents the smallest of the four major venues analyzed, but provides insights into user behavior around higher-yield opportunities.</p>
<p>Current TVL on Spectra stands at $1,439. The pool's history shows it launched around July 20, 2025, experiencing rapid initial growth to approximately $20,000 by July 27, a 14x increase in the first week of operation.</p>
<blockquote>
<p><strong>This represented the fastest relative growth rate across any Level Protocol venue.</strong></p>
</blockquote>
<p>Interestingly, the pool's composition has consistently favored lvlUSD, which comprised roughly 60-70% of the pool during its peak and continues to dominate the current distribution.</p>
<h3 id="heading-level-protocol-data-observations"><strong>Level Protocol Data Observations</strong></h3>
<p>The Level Protocol dashboard was something I had been working on for some months at <a target="_blank" href="https://x.com/@0nchainlab">@0nchainlab</a> to capture the complete cycle of Level Protocol's stablecoin and its utilisation. The subsequent months through October 2025 show the protocol operating at reduced but stable levels, around $5 million in combined supply.</p>
<p>Maybe that's due to the announcement that the Level team is being acquired by a leading DeFi protocol and will be joining them, thereby sunsetting both lvlUSD and slvlUSD.</p>
<p><a target="_blank" href="https://x.com/0xOptimusPrime/status/1976315186534285436">  
</a></p>
]]></content:encoded></item><item><title><![CDATA[Unlocking Account Abstraction with Amex Passport]]></title><description><![CDATA[When American Express launched Amex Passport in September 2025, minting NFT “stamps” on Base for international purchases, I knew the on-chain data would tell a story most people would miss.
Users have no idea they’re interacting with Web3. That invis...]]></description><link>https://www.0xoptimusprime.com/unlocking-account-abstraction-with-amex-passport</link><guid isPermaLink="true">https://www.0xoptimusprime.com/unlocking-account-abstraction-with-amex-passport</guid><category><![CDATA[Blockchain]]></category><category><![CDATA[Data Science]]></category><category><![CDATA[blockchain data]]></category><dc:creator><![CDATA[Olusegun Aborode]]></dc:creator><pubDate>Thu, 25 Sep 2025 23:00:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1765992243148/2b391d3b-f4d4-4efb-9dbb-a3e851489cdd.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When American Express launched Amex Passport in September 2025, minting NFT “stamps” on Base for international purchases, I knew the on-chain data would tell a story most people would miss.</p>
<p>Users have no idea they’re interacting with Web3. That invisibility is the product’s genius, but it also means the blockchain holds insights Amex isn’t publishing.</p>
<p>I built a <a target="_blank" href="https://dune.com/optimus_prime1/amex-passport-analysis">Dune dashboard</a> to decode it. Here’s how.</p>
<h4 id="heading-the-architecture-worth-understanding">The Architecture Worth Understanding</h4>
<p>Before writing queries, you need to grasp what Amex built:</p>
<ul>
<li><p><strong>Account Abstraction (ERC-4337)</strong> gives each user a smart contract wallet instead of a standard EOA. Amex’s backend controls minting: no seed phrases, no user action required.</p>
</li>
<li><p><strong>Paymaster contracts</strong> cover all gas fees. Users pay nothing. Amex funds a contract that validates legitimate mints and reimburses costs through the EntryPoint.</p>
</li>
<li><p><strong>Soulbound tokens</strong> have transfers disabled. Stamps can’t be sold; they’re purely commemorative. No speculation, no secondary market noise.</p>
</li>
</ul>
<h4 id="heading-the-contracts">The Contracts</h4>
<p>Three addresses matter:</p>
<ul>
<li><p><strong>NFT Contract</strong>: <code>0x96cebb59b00109dc8c4de1a9a94d9fad658fee46</code></p>
</li>
<li><p><strong>EntryPoint (v0.7.0)</strong>: <code>0x0000000071727de22e5e9d8baf0edac6f37da032</code></p>
</li>
<li><p><strong>Paymaster</strong>: <code>0x5fa66dfe8a3983e55071e8c4631ab43b5f33a4ab</code></p>
</li>
</ul>
<h3 id="heading-query-1-daily-mints-and-unique-wallets">Query 1: Daily Mints and Unique Wallets</h3>
<p>The Transfer event (<code>0xddf252ad...</code>) fires on every mint. Key filter: <code>topic1</code> equals the zero address; that’s a mint, not a transfer.</p>
<pre><code class="lang-plaintext">SELECT
  varbinary_ltrim(topic2) AS minted_to,
  varbinary_to_uint256(topic3) AS token_id
FROM base.logs
WHERE contract_address = 0x96cebb59b00109dc8c4de1a9a94d9fad658fee46
  AND topic0 = 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef
  AND topic1 = 0x0000000000000000000000000000000000000000000000000000000000000000
</code></pre>
<p><code>varbinary_ltrim()</code> strips left-padding from addresses. <code>varbinary_to_uint256()</code> converts the token ID. I join this with the <code>TokensMinted</code> event (<code>0x9d89e36e...</code>) to pull IPFS URIs when needed.</p>
<p>The full query aggregates daily mints, unique wallets, cumulative totals, and a 7-day moving average to smooth noise.</p>
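<p>As a sketch, the aggregation layer on top of that base query might look like this; the column names are mine, and the 7-day average is a standard window function:</p>
<pre><code class="lang-plaintext">WITH mints AS (
  SELECT
    block_time,
    varbinary_ltrim(topic2) AS minted_to
  FROM base.logs
  WHERE contract_address = 0x96cebb59b00109dc8c4de1a9a94d9fad658fee46
    AND topic0 = 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef
    AND topic1 = 0x0000000000000000000000000000000000000000000000000000000000000000
),

daily AS (
  SELECT
    date_trunc('day', block_time) AS day,
    COUNT(*) AS mints,
    COUNT(DISTINCT minted_to) AS unique_wallets
  FROM mints
  GROUP BY 1
)

SELECT
  day,
  mints,
  unique_wallets,
  SUM(mints) OVER (ORDER BY day) AS cumulative_mints,
  AVG(mints) OVER (ORDER BY day ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS mints_7d_avg
FROM daily
ORDER BY day
</code></pre>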
<h3 id="heading-query-2-wallet-distribution">Query 2: Wallet Distribution</h3>
<p>Are stamps concentrated in whales or spread across users?</p>
<pre><code class="lang-plaintext">WITH wallet_mint_counts AS (
  SELECT
    varbinary_ltrim(topic2) AS wallet,
    COUNT(*) AS stamps_per_wallet
  FROM base.logs
  WHERE contract_address = 0x96cebb59b00109dc8c4de1a9a94d9fad658fee46
    AND topic0 = 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef
    AND topic1 = 0x0000000000000000000000000000000000000000000000000000000000000000
  GROUP BY varbinary_ltrim(topic2)
)
SELECT
  stamps_per_wallet,
  COUNT(*) AS wallets_with_this_count
FROM wallet_mint_counts
GROUP BY stamps_per_wallet
</code></pre>
<p>I bucket wallets by stamp count and calculate the percentage of total wallets and stamps per bucket. The data shows remarkably even distribution, no whale dominance, indicating organic adoption across diverse cardholders.</p>
<h3 id="heading-query-3-gas-economics">Query 3: Gas Economics</h3>
<p>This reveals Amex’s actual infrastructure costs. The <code>UserOperationEvent</code> from EntryPoint contains everything:</p>
<pre><code class="lang-plaintext">SELECT
  varbinary_to_uint256(varbinary_substring(data, 1, 32)) AS nonce,
  varbinary_to_uint256(varbinary_substring(data, 33, 32)) &gt; 0 AS success,
  varbinary_to_uint256(varbinary_substring(data, 65, 32)) AS actual_gas_cost_wei,
  varbinary_to_uint256(varbinary_substring(data, 97, 32)) AS actual_gas_used
FROM base.logs
WHERE contract_address = 0x0000000071727de22e5e9d8baf0edac6f37da032
  AND topic0 = 0x49628fd1471006c1482da88028e9ce4dbb080b815c9b0344d39e5a8e6ec1419f
</code></pre>
<p>The <code>data</code> field packs non-indexed parameters sequentially in 32-byte chunks. I filter for operations whose paymaster (<code>topic3</code>) matches Amex's Paymaster address, then join with <code>prices.usd</code> to convert ETH costs to USD at the minute level.</p>
<p>The final output: daily sponsored operations, gas costs in ETH and USD, cost-per-stamp, and cumulative spend over time.</p>
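<p>Putting the pieces together, here is a hedged sketch of the paymaster filter plus the USD join. The paymaster sits in <code>topic3</code> of UserOperationEvent; the price join assumes native ETH is keyed in <code>prices.usd</code> with <code>symbol = 'ETH'</code> and a NULL blockchain, which is worth verifying before you rely on it:</p>
<pre><code class="lang-plaintext">WITH sponsored AS (
  SELECT
    block_time,
    CAST(varbinary_to_uint256(varbinary_substring(data, 65, 32)) AS double) / 1e18 AS gas_cost_eth
  FROM base.logs
  WHERE contract_address = 0x0000000071727de22e5e9d8baf0edac6f37da032 -- EntryPoint v0.7.0
    AND topic0 = 0x49628fd1471006c1482da88028e9ce4dbb080b815c9b0344d39e5a8e6ec1419f -- UserOperationEvent
    AND varbinary_ltrim(topic3) = 0x5fa66dfe8a3983e55071e8c4631ab43b5f33a4ab -- paymaster, the 3rd indexed param
)

SELECT
  date_trunc('day', s.block_time) AS day,
  COUNT(*) AS sponsored_ops,
  SUM(s.gas_cost_eth) AS gas_eth,
  SUM(s.gas_cost_eth * p.price) AS gas_usd
FROM sponsored s
JOIN prices.usd p
  ON p.minute = date_trunc('minute', s.block_time)
  AND p.symbol = 'ETH'
  AND p.blockchain IS NULL -- assumption: native ETH row; verify the keying
GROUP BY 1
ORDER BY 1
</code></pre>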
<h3 id="heading-technical-notes">Technical Notes</h3>
<ul>
<li><p><strong>Use DuneSQL’s varbinary functions</strong>: <code>varbinary_ltrim()</code>, <code>varbinary_to_uint256()</code>, and <code>varbinary_substring()</code> handle hex data cleanly. Don't fight with string casting.</p>
</li>
<li><p><strong>Always filter for mints</strong>: Transfer events fire for both mints and transfers. Without the zero-address filter, you’ll miscount.</p>
</li>
<li><p><strong>Verify byte offsets</strong>: The UserOperationEvent packs data at specific positions (1–32 for nonce, 33–64 for success, etc.). Get these wrong, and you’ll extract garbage. Cross-referencing with the contract ABI on BaseScan helps too.</p>
</li>
<li><p><strong>Join prices at minute granularity</strong>: The <code>prices.usd</code> table on Dune provides minute-level pricing. Match on <code>date_trunc('minute', block_time)</code> for accurate USD conversion.</p>
</li>
</ul>
<h3 id="heading-what-the-data-shows">What the Data Shows</h3>
<p>The dashboard reveals what Amex won’t publish: real adoption curves, cost efficiency trends, and wallet concentration metrics. And to be honest, they don’t need to. Public blockchains create accountability, even when the company abstracts the technology completely.</p>
<p><a target="_blank" href="https://dune.com/optimus_prime1/amex-passport-analysis"><strong>View the full dashboard →</strong></a></p>
]]></content:encoded></item><item><title><![CDATA[How Aptos Stablecoins Enable Cost-Effective Transactions]]></title><description><![CDATA[The demand for cost-effective payments is at an all-time high.
And stablecoins can be the bridge connecting these transactions across traditional and decentralized finance.
That’s why Aptos is well-positioned to support the growing stablecoin ecosyst...]]></description><link>https://www.0xoptimusprime.com/how-aptos-stablecoins-enable-cost-effective-transactions</link><guid isPermaLink="true">https://www.0xoptimusprime.com/how-aptos-stablecoins-enable-cost-effective-transactions</guid><category><![CDATA[Blockchain]]></category><category><![CDATA[Blockchain technology]]></category><category><![CDATA[Dune Analytics]]></category><category><![CDATA[blockchain data]]></category><dc:creator><![CDATA[Olusegun Aborode]]></dc:creator><pubDate>Mon, 16 Dec 2024 00:00:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766010801495/e1b1466e-dfdb-4969-b4a8-a737a179e06f.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The demand for cost-effective payments is at an all-time high.</p>
<p>And stablecoins can be the bridge connecting these transactions across traditional and decentralized finance.</p>
<p>That’s why Aptos is well-positioned to support the growing stablecoin ecosystem, with its capacity to scale high-volume cross-border transactions while processing over <a target="_blank" href="https://bixinventures.medium.com/aptos-is-the-frontier-of-high-performance-defi-6911b8fd0a60">160,000 transactions per second</a>. According to an <a target="_blank" href="https://medium.com/aptoslabs/block-stm-how-we-execute-over-160k-transactions-per-second-on-the-aptos-blockchain-3b003657e4ba">article</a> by <a target="_blank" href="https://x.com/@SashaSpiegelman">@SashaSpiegelman</a> and <a target="_blank" href="https://x.com/@rgelash">@rgelash</a>, <a target="_blank" href="https://x.com/@Aptos">@Aptos</a> achieves this high throughput using a multi-threaded, in-memory parallel execution engine that leverages Software Transactional Memory (STM).</p>
<p>TLDR: Transactions are faaast. These transactions confirm quickly and also cost mere fractions of a cent—roughly $0.00002 per transaction.</p>
<p>The edge Aptos brings is quite competitive for stablecoins that aim to beat the effectiveness of fiat alternatives on payment rails like Visa or Mastercard. And as <a target="_blank" href="https://x.com/@moshaikhs">@moshaikhs</a> noted on Chain Reaction:</p>
<blockquote>
<p>There is a need for a new way to share information digitally, one that allows users to exchange both information and economic value more efficiently and fairly, Aptos.</p>
</blockquote>
<p>Aptos is the new way, and the emphasis here is on how efficient the blockchain is across multiple use cases, especially payments.</p>
<p>Currently, stablecoin supply on Aptos is peaking at $729.8m, and the strong growth trend from January to now is easily visible, thanks to the launch of both USDT and USDC back-to-back this quarter.</p>
<p><img src="https://pbs.twimg.com/media/Ge6V7AIXYAAiecx?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<h3 id="heading-stablecoins-are-fast-cheap-and-scalable-on-aptoshttpsxcomaptos"><strong>Stablecoins are fast, cheap, and scalable on</strong> <a target="_blank" href="https://x.com/@Aptos"><strong>@Aptos</strong></a><strong>.</strong></h3>
<p>With transactions and payments, the conversation between users or businesses usually revolves around <strong>Fees</strong>, <strong>Time to Settle</strong>, and <strong>Convenience (ease of use)</strong>.</p>
<p>Solutions that offer a competitive advantage across these factors tend to have an uncanny edge in the market. Aptos offers that advantage; here's how:</p>
<h3 id="heading-fees"><strong>Fees</strong></h3>
<p>If Aptos is that fast and cheap, does it outperform alternatives like MoneyGram, Western Union, Mastercard, etc.? Yes, it does.</p>
<p>Traditional rails, which you use daily, can take hours or days to settle cross-border transactions and charge 2–3% of each transaction, hitting small businesses especially hard.</p>
<p>So, if a coffee shop only nets a dollar of profit for every two-dollar latte sold, losing an extra 15 cents to payment fees significantly hurts margins.</p>
<p>Utilizing stablecoins via Aptos, by contrast, can lower these fees almost to zero.</p>
<p>According to the <a target="_blank" href="https://x.com/@ournetwork__">@ournetwork__</a> report, Aptos is the cheapest chain for USDT transfers, costing about $0.0002 in fees.</p>
<p><img src="https://pbs.twimg.com/media/Ge2Q26ZXoAA3asX?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>From a business perspective, having a steep drop in transaction fees could offer business owners more operational opportunities and better margins, amongst other perks.</p>
<h3 id="heading-settlement-period"><strong>Settlement Period</strong></h3>
<p>The settlement period for a transaction is another focus for users, especially with cross-border payments.</p>
<p>Services like MoneyGram and Western Union can <a target="_blank" href="https://corporate.moneygram.com/documents/PDF_Forms/TermsAndCond/Europe/Belgium/Send%20Terms%20English.pdf/">take up to 4 days</a> to finalize transactions outside the European Economic Area.</p>
<p>In contrast, Aptos achieves settlement in approximately 6.5 seconds, regardless of location.</p>
<p><img src="https://pbs.twimg.com/media/Ge3K448WgAAxGBH?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>This 6.5-second block time ensures fast and efficient transaction confirmations, making Aptos a competitive solution for financial applications and real-time use cases.</p>
<p>The speed also impacts the ease of remittances, where you can send USDT (on Aptos) to someone back home, pay about $0.0002 in fees, and it arrives in about 6.5 seconds or less.</p>
<p>That’s a game-changer, not just for individuals, but also for small businesses in emerging markets in West Africa that suffer from slow wire transfers and high exchange markups.</p>
<p>Now, someone in the UK can send money back home to Senegal quickly and cheaply, for about $0.0002 in fees, removing multiple intermediaries and the associated risk of payment failure.</p>
<p>An alternative for this same transaction is to use Ria Money Transfer. To send 100 GBP, almost 700 XOF is lost to the exchange rate <a target="_blank" href="https://www.oanda.com/currency-converter/en/?from=GBP&amp;to=XOF&amp;amount=100">compared to market rates</a>.</p>
<p><img src="https://pbs.twimg.com/media/Ge7qwDoWAAA9EQH?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>You'd also have to pay about 4 GBP in fees for this transaction to be processed within a couple of minutes or hours.</p>
<p><img src="https://pbs.twimg.com/media/Ge7qovIXEAAV2lD?format=png&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>When it comes down to preference, I agree with <a target="_blank" href="https://x.com/@SamBroner">@SamBroner</a>’s comments on stablecoins being the <a target="_blank" href="https://x.com/SamBroner/status/1867258548377596171">cheapest way to send a dollar, with profitability unlock &amp; a new platform</a>.</p>
<h3 id="heading-ease-of-use"><strong>Ease of Use</strong></h3>
<p>Convenience is the crux of it all.</p>
<p>What's the point of fast transactions, cheap fees, and an impressive permissionless composable infrastructure if users can't use it within real-world use cases like remittances and cross-border payments?</p>
<p>That's why I believe the Aptos Card, thanks to <a target="_blank" href="https://x.com/thisisarculus">@thisisarculus</a> and <a target="_blank" href="https://x.com/AptosLabs">@AptosLabs</a> collaboration, can unlock an interesting opportunity for users paying with real-world assets.</p>
<p>Similar to how I use <a target="_blank" href="https://x.com/@wirexapp">@wirexapp</a> card and now recently <a target="_blank" href="https://x.com/@KAST_official">@KAST_official</a> cards for real-world transactions.</p>
<p>More than that, thanks to the recent partnership with <a target="_blank" href="https://x.com/@stripe">@stripe</a> and <a target="_blank" href="https://x.com/@circle">@circle</a>, there could be exciting <a target="_blank" href="https://www.coindesk.com/business/2024/11/21/payments-giant-stripe-brings-crypto-services-to-aptos-as-circles-usdc-stablecoin-launch-on-the-network">on-ramp support for USDC on Aptos via Stripe.</a></p>
<p>This would make transactions easier and more convenient for the real-world use case.</p>
<h3 id="heading-stablecoins-move-better-on-aptos"><strong>Stablecoins Move Better on Aptos</strong></h3>
<p>Stablecoins have already proven they can move billions of dollars across borders faster and more cheaply than traditional rails and also accrue rewards for users.</p>
<p>Over-collateralized, yield-bearing stablecoins like MOD (Move Dollar) are specifically designed with reward mechanisms.</p>
<p>With over $2.3B in traded volume, users can also stake MOD in the Thala Stability Pool (TSP) to earn rewards easily.</p>
<p><img src="https://pbs.twimg.com/media/Ge4F6ecWAAAw3vp?format=jpg&amp;name=small" alt="Image" class="image--center mx-auto" /></p>
<p>My thought is that as adoption grows, users will have a way to exchange both information and economic value more efficiently and fairly.</p>
<p>For businesses, the <strong>bottom-line impact</strong> is profound: stablecoins can save billions in fees and expand cross-border trade. For users, stablecoins offer a simple, low-cost way to send value globally, cutting out an underbrush of banking intermediaries.</p>
<p>For developers, Aptos’ parallel execution engine and the Move programming language provide exciting ground for building next-gen DeFi applications, orchestrating stablecoin solutions, and bridging Web2 to Web3.</p>
<p><strong>Ultimately, stablecoins move better on Aptos.</strong></p>
<p><strong>Data Reference:</strong></p>
<ol>
<li><p><a target="_blank" href="https://flipsidecrypto.xyz/studio/dashboards/b264fd20-afa2-412e-8cbc-82cddde83426?beta">Aptos: Layer 1 Engineered To Evolve</a></p>
</li>
<li><p><a target="_blank" href="https://flipsidecrypto.xyz/optimus_prime/mod-stablecoin-performance-analysis-7cFTPR">$MOD Stablecoin Performance Analysis</a></p>
</li>
<li><p><a target="_blank" href="https://www.ournetwork.xyz/p/on-298-stablecoins">ON–298: Stablecoins</a></p>
</li>
<li><p><a target="_blank" href="https://flipsidecrypto.xyz/0xHaM-d/aptos-stable-coin-supply-PxokVd">Aptos || Stable Coin Supply</a></p>
</li>
</ol>
<p><a target="_blank" href="https://x.com/0xOptimusPrime/status/1868690880581812371">  
</a></p>
]]></content:encoded></item><item><title><![CDATA[Smart Contracts Explained: A Comprehensive Guide for Data Analysts]]></title><description><![CDATA[When I started querying blockchain data, I knew SQL. I didn't know why some things showed up in tables and others didn't. Why could I find every token transfer but not historical balances? Why did som]]></description><link>https://www.0xoptimusprime.com/smart-contracts-explained-a-comprehensive-guide-for-data-analysts</link><guid isPermaLink="true">https://www.0xoptimusprime.com/smart-contracts-explained-a-comprehensive-guide-for-data-analysts</guid><category><![CDATA[Blockchain]]></category><category><![CDATA[Blockchain technology]]></category><category><![CDATA[blockchain data]]></category><dc:creator><![CDATA[Olusegun Aborode]]></dc:creator><pubDate>Thu, 08 Feb 2024 09:00:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766008294733/482e31ac-e27d-4708-aa44-3deca41a18da.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When I started querying blockchain data, I knew SQL. I didn't know why some things showed up in tables and others didn't. Why could I find every token transfer but not historical balances? Why did some contracts have clean decoded tables while others required parsing raw hex?</p>
<p>You see, I thought I could query a smart contract like a database. Pull up a wallet, check its balance at any point in time, done. But I quickly found out that is not how it works. Flipside and Dune felt daunting to use at first, and I had to learn that the gap was in my understanding of smart contract fundamentals.</p>
<p>Once I understood how contracts actually work, what creates on-chain records versus what doesn't, everything clicked. This is what I wish someone had told me earlier when it came to data. Okay, here’s what I mean:</p>
<p>A smart contract is a program on a blockchain. Unlike server software, it runs on thousands of computers simultaneously and can't be modified once deployed. Think of it as a vending machine:</p>
<ul>
<li><p>The machine is the contract</p>
</li>
<li><p>The buttons are functions</p>
</li>
<li><p>The receipt is an event</p>
</li>
<li><p>The product is the state change (tokens moving, balances updating)</p>
</li>
</ul>
<p>The receipt is what we query as data analysts and engineers. However, the machine's internal state is harder to access directly.</p>
<h3>My first real task was simple</h3>
<p>Find the balance of a wallet at a specific block height from six months ago. I wrote what I thought was a straightforward query, looking for some kind of balance history table. Nothing came back.</p>
<p>I spent an hour thinking my SQL was wrong before I realized the data I was looking for did not exist. Not because it was deleted or hidden, but because it was never recorded in the first place.</p>
<p>Smart contracts have a function called balanceOf() that returns the current balance of any address. But here is the thing: calling that function does not create a transaction. It does not write anything to the blockchain. It just reads the current state and returns it. There is no log, no receipt, no record that the call ever happened.</p>
<p>That was my first real lesson. Read functions are invisible. They answer questions in the moment but leave no trace for analysts to find later.</p>
<h3>Why Some Data Exists, and Some Doesn't</h3>
<p>This confused me initially. The answer lies in understanding functions.</p>
<ul>
<li><p>Write functions modify the blockchain. When called, they create a transaction with a hash, block number, timestamp, and gas cost. This is queryable data.</p>
</li>
<li><p>Read functions only retrieve the current state. They're free to call, create no transaction, and leave no on-chain trace.</p>
</li>
</ul>
<p>This is why you can't directly query "what was wallet X's balance at block Y" in Dune. That's a read function result. But you <em>can</em> reconstruct it by aggregating Transfer events; every balance change leaves a receipt. That’s the trick.</p>
<h3>To understand this better, think of it like a Vending Machine</h3>
<p>Someone gave me an analogy that made everything fall into place.</p>
<p>Think of a smart contract like a vending machine. You walk up, put in money, press a button, and get a snack. The machine has rules built into it. If you pay the right amount and press a valid button, you get your product. No negotiation, no human intervention.</p>
<p>Now think about what gets recorded:</p>
<ul>
<li><p>Pressing a button that dispenses a snack? That is a write function. Something changed. The machine's inventory went down, your money went in, and there is a receipt.</p>
</li>
<li><p>Looking at the display to see what is available? That is a read function. You got information, but nothing changed. No receipt.</p>
</li>
</ul>
<p>As a data analyst, you only have access to the receipts. This reframed everything for me. I stopped looking for data that was never written and started focusing on what was: the events.</p>
<h3>Events: The Analyst's Primary Data Source</h3>
<p>Events are the primary data source for blockchain analytics. They are structured logs that contracts emit when something happens.</p>
<p>The standard ERC-20 Transfer event looks like this:</p>
<p><code>Transfer(address indexed from, address indexed to, uint256 value)</code></p>
<p>Every time tokens move, this event fires. It records who sent, who received, and how much. This is what I query in Dune.</p>
<p>The word "indexed" matters. Indexed parameters become "topics" that I can filter on efficiently. The Transfer event has <code>from</code> and <code>to</code> indexed, which means I can quickly find all transfers to or from a specific address.</p>
<p>In Ethereum logs:</p>
<ul>
<li><p><code>topic0</code> is the event signature (identifies which event type)</p>
</li>
<li><p><code>topic1</code> is the first indexed parameter (<code>from</code>)</p>
</li>
<li><p><code>topic2</code> is the second indexed parameter (<code>to</code>)</p>
</li>
<li><p><code>data</code> contains everything else (<code>value</code>)</p>
</li>
</ul>
<p>When I write a query to find all transfers to a specific wallet, I am filtering on <code>topic2</code>.</p>
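<p>For example, a minimal sketch of that filter looks like this; the wallet address is a stand-in, and <code>varbinary_ltrim</code> undoes the 32-byte padding covered later in this post:</p>
<pre><code class="lang-plaintext">-- All ERC-20 transfers INTO one wallet, filtered on topic2 ("to")
SELECT
  block_time,
  contract_address AS token,
  varbinary_ltrim(topic1) AS sender,
  varbinary_to_uint256(data) AS raw_value
FROM ethereum.logs
WHERE topic0 = 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef -- Transfer
  AND varbinary_ltrim(topic2) = 0x28c6c06298d514db089934071355e5743bf21d60 -- stand-in wallet
  AND block_time &gt; now() - interval '7' day -- keep the scan cheap
</code></pre>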
<h3>Function Selectors and Event Signatures</h3>
<p>Every function and event has a unique identifier. These are hashes that look like <code>0xa9059cbb</code> for functions or <code>0xddf252ad...</code> for events. For the <code>transfer(address,uint256)</code> function:</p>
<ul>
<li><p>Take the function signature: <code>transfer(address,uint256)</code></p>
</li>
<li><p>Hash it with <code>keccak256</code></p>
</li>
<li><p>Take the first 4 bytes: <code>0xa9059cbb</code></p>
</li>
</ul>
<p>This selector is included in every transaction that calls this function. So if I want to find all transfer calls (not just the events, but the actual function calls), I can filter transactions by this selector.</p>
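<p>A quick sketch of that selector filter on <code>ethereum.transactions</code>, reusing the token contract that appears in the balance examples later in this post:</p>
<pre><code class="lang-plaintext">-- All transactions calling transfer(address,uint256) on one token
SELECT hash, "from", block_time
FROM ethereum.transactions
WHERE "to" = 0x7e2ac793f3E692f388e66c7DC28F739d13B0B71A -- the token from the examples below
  AND varbinary_substring(data, 1, 4) = 0xa9059cbb -- first 4 bytes of keccak256("transfer(address,uint256)")
</code></pre>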
<p>For events, it is similar but uses the full 32-byte hash as <code>topic0</code>.</p>
<p><mark>[Updating what I do now]: Now, I do not calculate these by hand. Tools like </mark> <a href="http://Herd.eco"><mark>Herd.eco</mark></a> <mark> show them for every function and event, which saves time. Previously, when this article was written, you’d need to do it manually.</mark></p>
<h3>Another Thing I Learned About Addresses</h3>
<p>Ethereum addresses are case-insensitive, but SQL comparisons might not be. I ran into issues early on where queries returned nothing because of case mismatches.</p>
<p>Now I always normalize addresses to lowercase or use the native bytea format in DuneSQL:</p>
<pre><code class="lang-plaintext">-- This works in DuneSQL
WHERE contract_address = 0x7e2ac793f3E692f388e66c7DC28F739d13B0B71A
</code></pre>
<p>Another thing: addresses in topics are padded to 32 bytes. So a 20-byte address gets 12 bytes of zeros in front. This matters when working with raw logs.</p>
<p>I assumed that a contract address always meant the same code. That is not true for upgradeable contracts.</p>
<p>Many contracts use a proxy pattern. There are two contracts:</p>
<ol>
<li><p>The proxy holds the state (balances, etc.) and has the address everyone uses</p>
</li>
<li><p>The implementation holds the logic and can be swapped out</p>
</li>
</ol>
<p>When the team upgrades the contract, they deploy new logic and point the proxy to it. The address stays the same, but the code changes.</p>
<p>For data analysis, this means historical data might have been generated by different code than what exists today. If I am analyzing a contract that has been upgraded, I need to check when the upgrades happened and whether they changed anything relevant to my analysis.</p>
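<p>The Upgraded event tells me when implementation changes occurred. Here is a minimal sketch for pulling a proxy’s upgrade history; the topic0 is the standard EIP-1967 <code>Upgraded(address)</code> signature, and USDC’s proxy stands in as the example address:</p>
<pre><code class="lang-plaintext">SELECT
  block_time,
  varbinary_ltrim(topic1) AS new_implementation -- the indexed implementation address
FROM ethereum.logs
WHERE contract_address = 0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48 -- USDC proxy, as an example
  AND topic0 = 0xbc7cd75a20ee27fd9adebab32041f755214dbc6bffa90cc0225b39da2e5c2d3b -- Upgraded(address)
ORDER BY block_time
</code></pre>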
<p><mark>Now, when I use </mark> <a href="http://Herd.eco"><mark>Herd.eco</mark></a> <mark> to analyze a contract</mark>, I see emission counts for each event. Some show zero. At first, I thought this meant the event was not implemented. That is wrong. It means the event exists in the code but has never been triggered.</p>
<p>The zero-emission events are still important. They tell me what the contract can do, even if it has not done it yet.</p>
<p>For risk analysis, knowing that a contract has blacklisting capability matters even if no one has been blacklisted.</p>
<h3>How I Approach a New Contract Now</h3>
<p>When I need to analyze a new contract, here is my process:</p>
<p>Step 1: Identify the basics</p>
<ul>
<li><p>Is it upgradeable? (Check for proxy patterns)</p>
</li>
<li><p>What standard does it follow? (ERC-20, ERC-721, custom)</p>
</li>
<li><p>When was it deployed?</p>
</li>
</ul>
<p>Step 2: Map the functions</p>
<ul>
<li><p>What write functions exist?</p>
</li>
<li><p>Which ones have actually been called?</p>
</li>
<li><p>What access control is in place?</p>
</li>
</ul>
<p>Step 3: Map the events</p>
<ul>
<li><p>What events does it emit?</p>
</li>
<li><p>Which ones have emissions?</p>
</li>
<li><p>What data do they contain?</p>
</li>
</ul>
<p>Step 4: Build queries from events</p>
<ul>
<li><p>Start with the high-emission events</p>
</li>
<li><p>Reconstruct state from event history</p>
</li>
<li><p>Add filters for specific analysis</p>
</li>
</ul>
<p><mark>Tools like </mark> <a href="http://Herd.eco"><mark>Herd.eco</mark></a> make steps 2 and 3 fast. I can see all functions and events with their call counts and signatures in one view.</p>
<h3>Reconstructing Historical State</h3>
<p>Since I cannot query historical read function results directly, I reconstruct the state from events.</p>
<p>For token balances:</p>
<pre><code class="lang-plaintext">WITH transfers AS (
  SELECT
    "to" AS address,
    CAST(value AS DECIMAL(38,0)) AS amount
  FROM erc20_ethereum.evt_Transfer
  WHERE contract_address = 0x7e2ac793f3E692f388e66c7DC28F739d13B0B71A

  UNION ALL

  SELECT
    "from" AS address,
    -CAST(value AS DECIMAL(38,0)) AS amount
  FROM erc20_ethereum.evt_Transfer
  WHERE contract_address = 0x7e2ac793f3E692f388e66c7DC28F739d13B0B71A
)
SELECT
  address,
  SUM(amount) / 1e18 AS balance
FROM transfers
WHERE address != 0x0000000000000000000000000000000000000000
GROUP BY address
HAVING SUM(amount) &gt; 0
ORDER BY balance DESC
</code></pre>
<p>This gives me current balances. To get historical balances as of a specific block, I add a block-number filter inside each half of the union, for example (the cutoff block below is a hypothetical):</p>
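<pre><code class="lang-plaintext">-- Added to each SELECT inside the union: balances as of block 18,000,000 (hypothetical cutoff)
AND evt_block_number &lt;= 18000000
</code></pre>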
<h3>Mints and Burns Are Just Special Transfers</h3>
<p>This was a small realization, but it simplified my thinking.</p>
<p>A mint is a Transfer event where the from address is the zero address (0x000...000). Tokens are created from nothing.</p>
<p>A burn is a Transfer event where the to address is the zero address. Tokens are destroyed.</p>
<p>So I do not need separate mint and burn queries. I just filter Transfer events:</p>
<pre><code class="lang-plaintext">-- Mints
WHERE "from" = 0x0000000000000000000000000000000000000000

-- Burns
WHERE "to" = 0x0000000000000000000000000000000000000000
</code></pre>
<p>Some contracts emit custom TokensMinted or TokensBurned events with additional data, but the Transfer event is always there.</p>
<h3>What I Wish I Knew Earlier</h3>
<ol>
<li><p>Read functions leave no trace. Do not look for data that does not exist.</p>
</li>
<li><p>Events are the data source. Learn to work with them.</p>
</li>
<li><p>Indexed parameters are filterable. Check which parameters are indexed before writing queries.</p>
</li>
<li><p>Upgradeable contracts change over time. Check for proxy patterns and upgrade history.</p>
</li>
<li><p>Zero emissions means unused, not missing. The capability exists even if it has not been used.</p>
</li>
<li><p>Addresses need normalization. Be defensive about case and formatting.</p>
</li>
<li><p>Tools exist to speed this up. <mark>Herd.eco</mark>, Etherscan, and others show contract structure without reading code.</p>
</li>
<li><p>Historical state is reconstructed, not queried. Build it from events.</p>
</li>
</ol>
<h3>Resources That Helped Me</h3>
<ul>
<li><p><a href="https://dune.com/docs/">Dune Analytics Documentation</a> - Query patterns and table schemas</p>
</li>
<li><p><a href="https://etherscan.io/">Etherscan</a> - Contract verification and basic analysis</p>
</li>
<li><p><a href="https://herd.eco/"><mark>Herd.eco</mark></a> <mark> - </mark> Function and event mapping with call counts</p>
</li>
<li><p><a href="https://docs.openzeppelin.com/contracts/">OpenZeppelin Docs</a> - Standard contract patterns</p>
</li>
<li><p><a href="https://eips.ethereum.org/EIPS/eip-20">EIP-20</a> - The ERC-20 token standard</p>
</li>
</ul>
]]></content:encoded></item></channel></rss>