feat: Add Spark wallet support with Breez Nodeless SDK#841

Open
aljazceru wants to merge 1 commit intocashubtc:mainfrom
nostr-net:breez-spark-backend

Conversation

@aljazceru

  • Adds SparkWallet backend
  • Fixes docker-compose.yaml to default to FakeWallet only when the env variables are not present

@callebtc
Collaborator

nice one!

@callebtc added the needs review, enhancement (New feature or request), mint (About the Nutshell mint), and lightning (Lightning network) labels on Nov 30, 2025
assert settings.mint_spark_api_key, "MINT_SPARK_API_KEY not set"
assert settings.mint_spark_mnemonic, "MINT_SPARK_MNEMONIC not set"

network_name = getattr(settings, "mint_spark_network", "mainnet").lower()
Contributor

@TheRealCheebs Dec 5, 2025

You have these defaults already defined in settings.py:

    mint_spark_network: str = Field(default="mainnet")
    mint_spark_storage_dir: str = Field(default="data/spark")
    mint_spark_connection_timeout: int = Field(default=30)
    mint_spark_retry_attempts: int = Field(default=3)

We should manage them in one place.

Author

Ah yeah, good catch, I'll remove the duplicates.

Author

@TheRealCheebs I've finally looked into this properly. Technically, it follows the pattern from other backends, which define settings in settings.py but validate the required credentials in their respective backend files:

macaroon = settings.mint_corelightning_rest_macaroon

if not endpoint:

Contributor

I think those two lines show the pattern you already have for mint_spark_api_key and mint_spark_mnemonic: raising an exception or asserting when nothing is set. What I'm calling out is different: you set defaults in settings.py for mint_spark_network, mint_spark_storage_dir, etc., and then set the same defaults again in the third parameter of getattr. The getattr defaults should be dropped as duplicates, so that defaults are managed in one place.
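The suggested change could look something like this (a minimal sketch; the dataclass below is a stand-in for the real pydantic settings object, and only two of the Spark fields are shown):

```python
from dataclasses import dataclass

# Hypothetical stand-in for the real settings object; in the actual
# code these defaults live in settings.py as Field(default=...) entries.
@dataclass
class SparkSettings:
    mint_spark_network: str = "mainnet"
    mint_spark_retry_attempts: int = 3

settings = SparkSettings()

# Before: the default "mainnet" is duplicated in the getattr call
network_name = getattr(settings, "mint_spark_network", "mainnet").lower()

# After: the attribute always exists, so read it directly and let
# settings.py remain the single source of truth for defaults
network_name = settings.mint_spark_network.lower()
```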

Comment on lines +12 to +15
- MINT_BACKEND_BOLT11_SAT=${MINT_BACKEND_BOLT11_SAT:-FakeWallet}
- MINT_LISTEN_HOST=${MINT_LISTEN_HOST:-0.0.0.0}
- MINT_LISTEN_PORT=${MINT_LISTEN_PORT:-3338}
- MINT_PRIVATE_KEY=${MINT_PRIVATE_KEY:-TEST_PRIVATE_KEY}
Collaborator

unrelated changes?

Author

Yeah, kind of. It makes the docker-compose file actually work properly (variables are taken from .env with sane defaults); otherwise you either need a docker-compose.override.yaml or you always have a diff of the file in the repo. So it's an improvement, but it should possibly go in a separate PR :)

COPY . .
RUN poetry config virtualenvs.create false
RUN poetry install --no-dev --no-root

Collaborator

?

@ye0man ye0man added this to nutshell Jan 21, 2026
@github-project-automation github-project-automation bot moved this to Backlog in nutshell Jan 21, 2026
@a1denvalu3
Collaborator

Vulnerability Report: Missing Fee Limit Enforcement in Spark Backend Allows Node Drain

Description

The breez-spark-backend branch introduces the SparkWallet lightning backend (cashu/lightning/spark.py). However, its pay_invoice method completely ignores the fee_limit_msat parameter passed to it by the Mint's ledger.py.

When a user requests a melt quote, the Mint uses get_payment_quote to fetch an estimated routing fee. The user then provides Cashu tokens covering the invoice amount plus this estimated fee_reserve. When the user proceeds to pay the quote, the Mint calls backend.pay_invoice(quote, fee_limit_msat=fee_reserve * 1000).

In the SparkWallet, pay_invoice ignores this limit and recalculates the route from scratch:

    async def pay_invoice(
        self, quote: MeltQuote, fee_limit_msat: int
    ) -> PaymentResponse:
        try:
            sdk = await self._sdk_instance()
            prepare_request = PrepareSendPaymentRequest(
                payment_request=quote.request,
                amount=None,
            )
            # Re-calculates route and fee!
            prepare_response = await sdk.prepare_send_payment(
                request=prepare_request
            )
            
            options = SendPaymentOptions.BOLT11_INVOICE(
                prefer_spark=False, completion_timeout_secs=30
            )
            # Sends the payment with the newly computed fee, without checking fee limit
            request = SendPaymentRequest(
                prepare_response=prepare_response, options=options
            )
            response = await sdk.send_payment(request=request)

Because prepare_response.fees_sats is never compared against fee_limit_msat, the Spark node will blindly pay whatever routing fee the new path requires, even if it vastly exceeds what the user paid to the Mint.

Exploitation (PoC)

An attacker can exploit this to drain the Mint's Spark node balance via exorbitant routing fees:

  1. The attacker sets up a Lightning node (Destination) and a routing node (Router). They create a channel from the Mint to Router, and Router to Destination.
  2. The attacker generates an invoice on Destination for 100 sats.
  3. The attacker configures Router to charge 0 sats fee.
  4. The attacker requests a melt quote from the Mint for this invoice. SparkWallet.get_payment_quote calculates a route with a 0 sat fee and returns a quote with the default fallback fee (e.g. 2 sats).
  5. The attacker accepts the quote, preparing to pay 102 sats in Cashu tokens.
  6. Before executing the melt, the attacker dynamically updates the fee on their Router node to an arbitrarily high amount, e.g., 1,000,000 sats.
  7. The attacker executes the melt. SparkWallet.pay_invoice calls prepare_send_payment again. The new route requires a 1,000,000 sat routing fee.
  8. SparkWallet sends the payment anyway, paying the 100 sats invoice + 1,000,000 sats routing fee to the attacker.
  9. The Mint loses 1,000,000 sats, while the attacker only paid 102 sats worth of Cashu tokens.
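The economics of the steps above can be sanity-checked with a few lines (all values in sats; the 2-sat fallback fee is the assumed default from step 4):

```python
# Back-of-the-envelope accounting for the PoC above (all values in sats)
invoice_amount = 100
fee_reserve = 2                       # assumed default fallback fee quoted to the user
tokens_paid_by_attacker = invoice_amount + fee_reserve   # 102 sats of Cashu tokens

malicious_routing_fee = 1_000_000     # set on the Router after the quote is issued
mint_outflow = invoice_amount + malicious_routing_fee    # what the Spark node pays

net_loss = mint_outflow - tokens_paid_by_attacker
print(net_loss)  # 999998 sats lost by the mint per exploit round
```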

Mitigation

In cashu/lightning/spark.py, enforce the fee_limit_msat in pay_invoice by checking the computed fee against the limit before sending the payment:

            prepare_response = await sdk.prepare_send_payment(
                request=prepare_request
            )

            estimated_fee = getattr(prepare_response, "fees_sats", None) or getattr(prepare_response, "fees", 0)
            if int(estimated_fee) * 1000 > fee_limit_msat:
                return PaymentResponse(
                    result=PaymentResult.FAILED,
                    error_message=f"Fee exceeded limit: {estimated_fee} sats > {fee_limit_msat / 1000} sats"
                )
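The comparison itself can be isolated and tested in a few lines (a minimal sketch; this PrepareSendPaymentResponse is a hypothetical stand-in for the Breez SDK response object, and only the fees_sats field matters for the check):

```python
from dataclasses import dataclass

# Hypothetical stand-in for the Breez SDK's prepare-response object
@dataclass
class PrepareSendPaymentResponse:
    fees_sats: int

def fee_within_limit(
    prepare_response: PrepareSendPaymentResponse, fee_limit_msat: int
) -> bool:
    """Return True only if the freshly computed routing fee (sats)
    stays within the limit the mint's ledger passed in (msats)."""
    estimated_fee_sats = getattr(prepare_response, "fees_sats", 0) or 0
    return int(estimated_fee_sats) * 1000 <= fee_limit_msat

# The quote reserved 2 sats of fee; an honest 0-sat route passes
assert fee_within_limit(PrepareSendPaymentResponse(fees_sats=0), 2_000)
# The attacker's 1,000,000-sat routing fee must be rejected
assert not fee_within_limit(PrepareSendPaymentResponse(fees_sats=1_000_000), 2_000)
```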
