
Feat/collapse payments csv #942

Merged
Klakurka merged 26 commits into master from feat/collapse-payments-csv
Feb 26, 2025
Conversation

@lissavxo
Collaborator

Related to #934

Description

Changes the CSV logic to collapse payments valued under $1 USD on the same day.
Adds two new columns to the file: Collapsed and Notes.
If a line contains collapsed payments, the Collapsed column will be true and Notes will display "<button-name> - <payments-collapsed-count> transactions".

Test plan

  • Test with an address that has a bunch of payments smaller than $1 USD to see whether the collapsing behavior works.
  • Also test with a list of payments smaller than $1 USD plus a larger payment between them; the expected result is 3 lines (2 collapsed and 1 not collapsed), ordered by date.
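The collapsing rule described above can be sketched roughly as follows; the Payment shape, the field names, and the collapsePayments helper are illustrative assumptions, not the actual implementation.

```typescript
// Hypothetical sketch of the collapsing rule: payments under $1 USD are
// grouped per day into one row; larger payments keep their own rows.
interface Payment { date: string, valueUsd: number, txId: string }

interface CsvRow {
  date: string
  valueUsd: number
  txIds: string[]
  collapsed: boolean
  notes: string
}

const THRESHOLD_USD = 1

function collapsePayments (payments: Payment[], buttonName: string): CsvRow[] {
  const rows: CsvRow[] = []
  const smallByDay = new Map<string, Payment[]>()

  for (const p of payments) {
    if (p.valueUsd < THRESHOLD_USD) {
      const group = smallByDay.get(p.date) ?? []
      group.push(p)
      smallByDay.set(p.date, group)
    } else {
      // Payments at or above the threshold are never collapsed.
      rows.push({ date: p.date, valueUsd: p.valueUsd, txIds: [p.txId], collapsed: false, notes: '' })
    }
  }

  for (const [date, group] of smallByDay) {
    rows.push({
      date,
      valueUsd: group.reduce((sum, p) => sum + p.valueUsd, 0),
      txIds: group.map(p => p.txId),
      collapsed: true,
      notes: `${buttonName} - ${group.length} transactions`
    })
  }

  // The test plan expects output ordered by date.
  return rows.sort((a, b) => a.date.localeCompare(b.date))
}
```

With two sub-$1 payments on one day, a large payment the next day, and one sub-$1 payment the day after, this yields the 3 lines (2 collapsed, 1 not) that the second test-plan bullet describes.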

@lissavxo lissavxo requested a review from Klakurka February 10, 2025 15:52
@Klakurka
Member

Tests failing.

@Klakurka Klakurka left a comment

  • Let's remove the 'Collapsed' column from the CSV.
  • Let's put all of the transaction IDs into the tx ID column, comma-separated (assuming that's possible).
  • How about adding some tests around making sure the amounts and values add up correctly (i.e., the same amount and value totals as when not collapsed).


export const DEFAULT_PAYBUTTON_CSV_FILE_DELIMITER = ','
export const MAX_RECORDS_PER_FILE = 10000
export const DEFAULT_COLLAPSE_THRESHOLD_FILE_USD = 1
Member

How about changing this from USD to DEFAULT_CSV_COLLAPSE_THRESHOLD and in a follow-up branch, we can make this configurable in the account settings? The threshold will use whatever currency they have set, not just USD.

@Klakurka Klakurka requested a review from chedieck February 10, 2025 16:36
@lissavxo
Collaborator Author

  • Let's remove the 'Collapsed' column from the CSV.
  • Let's put all of the transaction IDs into the tx ID column, comma-separated (assuming that's possible).
  • How about adding some tests around making sure the amounts and values add up correctly (i.e., the same amount and value totals as when not collapsed).

The CSV uses commas to separate the columns, so separating the tx IDs with commas will mess up the file. Can we use a semicolon (;) instead?

@Klakurka
Member

  • Let's remove the 'Collapsed' column from the CSV.
  • Let's put all of the transaction IDs into the tx ID column, comma-separated (assuming that's possible).
  • How about adding some tests around making sure the amounts and values add up correctly (i.e., the same amount and value totals as when not collapsed).

The CSV uses commas to separate the columns, so separating the tx IDs with commas will mess up the file. Can we use a semicolon (;) instead?

Sure we can go with that.
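The agreement above can be sketched like this; the column layout and helper name are illustrative assumptions, not the project's real code.

```typescript
// Tx IDs joined with the CSV delimiter itself would split one cell into
// several columns, so a different separator (;) keeps them in one cell.
const CSV_DELIMITER = ','
const TX_ID_SEPARATOR = ';'

function buildCsvRow (date: string, amount: string, txIds: string[]): string {
  return [date, amount, txIds.join(TX_ID_SEPARATOR)].join(CSV_DELIMITER)
}
```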

@lissavxo lissavxo requested a review from Klakurka February 12, 2025 19:16
@Klakurka Klakurka left a comment

  • I know some may not want this, but for us it definitely needs to be done in both the Payments and Transactions CSVs. We'll eventually make this configurable.
  • Let's do the same thing with addresses that we're doing with txids (put them all in, but semicolon-separated).

@lissavxo lissavxo requested a review from Klakurka February 13, 2025 22:27
@Klakurka Klakurka left a comment

  • Something seems to be breaking the formatting: [screenshot]
  • The 'Amount' column is being rounded. We want to keep the decimal places (2 for XEC, 8 for BCH).
    I'm testing w/ EL addresses.

@lissavxo
Collaborator Author

  • Something seems to be breaking the formatting: [screenshot]
  • The 'Amount' column is being rounded. We want to keep the decimal places (2 for XEC, 8 for BCH).
    I'm testing w/ EL addresses.

This is due to opening the CSV in Excel. I changed the delimiter to a semicolon (;), which will make it readable in Excel.

@lissavxo lissavxo requested a review from Klakurka February 14, 2025 18:03
@Klakurka Klakurka left a comment

Same issue again: [screenshot]

@Klakurka Klakurka left a comment

Still an issue:

[screenshot]

Looks like the problem is that there's a max length on each row (or cell).

I think I'm ok with just omitting the tx IDs if there are multiple. Businesses are unlikely to need to reference the tx ID on small payments; the important ones will be the big ones. If there's more than one tx (a collapsed row), just say 'Multiple'.
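The fallback suggested here is simple enough to sketch; the helper name is hypothetical.

```typescript
// Collapsed rows with several txs print 'Multiple' instead of a long,
// row-length-breaking list of ids; single-tx rows keep their id.
function txIdCell (txIds: string[]): string {
  return txIds.length > 1 ? 'Multiple' : (txIds[0] ?? '')
}
```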

@lissavxo lissavxo requested a review from Klakurka February 19, 2025 15:10
@lissavxo lissavxo requested a review from Klakurka February 19, 2025 20:51
@lissavxo lissavxo force-pushed the feat/collapse-payments-csv branch from 440ce76 to bc12ff9 Compare February 19, 2025 20:56
@Klakurka Klakurka left a comment

I went through and did a more thorough check of the transactions and it appears that we're missing a bunch.

Here are some dates where we should have transactions but somehow don't in the CSV (for the 5 eCash Land addresses; 6733 txs total):

  • 2024-02-13
  • 2024-03-06
  • 2024-05-23
  • 2024-05-30
  • 2024-06-11
  • 2024-08-23
  • 2024-10-17
  • 2024-10-21
  • 2024-11-19
  • 2025-01-09

@lissavxo lissavxo force-pushed the feat/collapse-payments-csv branch from bc12ff9 to 93a681b Compare February 21, 2025 20:46
@Klakurka Klakurka left a comment

The tx count lines up now, but the amounts and values don't.

For example, 2024-10-19 should be 2068.67 XEC, not 2069 XEC (seemingly rounded).
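The rounding bug reported here amounts to formatting totals at whole-unit precision. A sketch of the intended behavior, with a hypothetical DECIMALS lookup:

```typescript
// Totals must keep the ticker's decimal places (2 for XEC, 8 for BCH);
// rounding to whole units is what turned 2068.67 XEC into 2069 XEC.
const DECIMALS: Record<string, number> = { XEC: 2, BCH: 8 }

function formatTotal (amounts: number[], ticker: string): string {
  const total = amounts.reduce((sum, a) => sum + a, 0)
  return total.toFixed(DECIMALS[ticker])
}
```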

@lissavxo lissavxo force-pushed the feat/collapse-payments-csv branch from cd9e4e4 to e3f6f34 Compare February 22, 2025 01:00
@Klakurka Klakurka left a comment

Exchange rates are sometimes rounding to 8 decimal places when they should be 14 (which is what the CD API returns).

@lissavxo lissavxo requested a review from Klakurka February 25, 2025 22:50
@lissavxo lissavxo force-pushed the feat/collapse-payments-csv branch from 0dfa4c7 to 933ce99 Compare February 26, 2025 00:11
Klakurka
Klakurka previously approved these changes Feb 26, 2025
@Klakurka Klakurka left a comment

Looks good - in production, I'm going to spend some extra energy making sure everything is as expected before closing the task.

@chedieck chedieck left a comment

Requesting:

  • a bug fix for 8 decimal places on fiat values when dealing with a BCH address,
  • avoiding possible float precision errors,
  • naming improvements,
  • other refactors.

utils/files.ts Outdated

const getPaybuttonTransactionsFileData = (transactions: TransactionsWithPaybuttonsAndPrices[], currency: SupportedQuotesType, timezone: string): TransactionFileData[] => {
const paymentsFileData: TransactionFileData[] = []
transactions.forEach(element => {
Collaborator

element is a bad name for something you know is a transaction. Any array has elements; this is the variable-naming equivalent of typing something as any. Just call it tx or transaction.

utils/files.ts Outdated
const { amount, hash, address, timestamp } = element
const value = getTransactionValueInCurrency(element, currency)
const date = moment.tz(timestamp * 1000, timezone)
const rate = value / amount.toNumber()
Collaborator

I couldn't find any float precision error myself, but I don't feel comfortable knowing it can happen.

We do have this price rate in the DB, no need to recalculate it from the value / amount, since we literally calculated the value from this price.

You can use something like

const price = tx.prices.find(p => p.price.quoteId === QUOTE_IDS[currency.toUpperCase()])!.price.value

utils/files.ts Outdated
const { timestamp, hash, address, amount } = tx
const values = getTransactionValue(tx)
const value = Number(values[currency])
const rate = value / Number(amount)
Collaborator

Same thing I commented below about float precision errors

utils/files.ts Outdated
collapseThreshold: number
): TransactionFileData[] => {
const treatedPayments: TransactionFileData[] = []
const tempGroups: Record<string, TransactionsWithPaybuttonsAndPrices[]> = {}
Collaborator

tempTxGroups is more clear

utils/files.ts Outdated
let totalPaymentsTreated = 0

const pushTempGroup = (groupKey: string): void => {
const tempGroup = tempGroups[groupKey]
Collaborator

tempTxGroup here

utils/files.ts Outdated
}
const totalAmount = tempGroup.reduce((sum, p) => sum + Number(p.amount), 0)
const totalValue = tempGroup.reduce((sum, p) => sum + Number(getTransactionValue(p)[currency]), 0)
const rate = totalValue / totalAmount
Collaborator

Same thing I commented below about float precision errors, with the difference that here we have many txs. These txs, however, all belong to the same UTC day, so they should have the same price. What you could do is

const uniquePrices = new Set(
  tempTxGroup
    .map(tx => tx.prices.find(p => p.price.quoteId === QUOTE_IDS[currency.toUpperCase()])!.price.value)
)

and then

if (uniquePrices.size !== 1) {
  // Should never come here
  throw new Error(RESPONSE_MESSAGES.INVALID_PRICES_FOR_TX_ON_CSV_CREATION_500(uniquePrices.size))
}

const rate = [...uniquePrices][0]

utils/files.ts Outdated
amount: amount.toFixed(DECIMALS[networkTicker]),
date: date.format(PRICE_API_DATE_FORMAT),
value: value.toFixed(2),
value: value.toFixed(DECIMALS[networkTicker]),
Collaborator

This is wrong; it will lead to USD/CAD values having 8 decimal digits if the networkId is 2 (BCH).

Should be DECIMALS.FIAT.
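A sketch of the fix being requested, assuming a DECIMALS constant shaped roughly like the project's (the exact shape is a guess):

```typescript
// Crypto amounts use the network's precision; fiat values always use 2
// decimals, even when the transaction is on BCH (networkId 2).
const DECIMALS = { XEC: 2, BCH: 8, FIAT: 2 } as const

function formatRow (amount: number, value: number, ticker: 'XEC' | 'BCH'): { amount: string, value: string } {
  return {
    amount: amount.toFixed(DECIMALS[ticker]),
    value: value.toFixed(DECIMALS.FIAT)
  }
}
```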

let networkIdArray = Object.values(NETWORK_IDS)
if (networkTicker !== undefined) {
const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
Collaborator

Why are you defaulting to ecash? Maybe if the slug is invalid we should just throw an error; we already have INVALID_NETWORK_SLUG_400.

Also, you could have const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[ticker]]

Collaborator

Also, this same code block appears right above downloadTxsFile in both of the only two places this function is ever called. Why not put it inside it, then?

Collaborator Author

Also, this same code block appears right above downloadTxsFile in both of the only two places this function is ever called. Why not put it inside it, then?

Because networkIdArray is used to fetch the txs, which is something we do outside the downloadTxsFile function.

if (networkTicker !== undefined) {
const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[networkTicker]]
Collaborator

Just noticed: why get the slug from the id if you're going to get the id from the slug? If the goal is to get the id, just forget the slug, check that the ticker is valid (we have a variable with the valid network tickers), and then get the id with NETWORK_IDS[networkTicker].
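The point above can be sketched as follows; the NETWORK_IDS values here are illustrative assumptions.

```typescript
// Map the ticker straight to a network id, with validation, instead of
// converting ticker -> slug -> id and back.
const NETWORK_IDS: Record<string, number> = { XEC: 1, BCH: 2 }

function getNetworkIdFromTicker (networkTicker: string): number {
  if (!(networkTicker in NETWORK_IDS)) {
    throw new Error(`Invalid network ticker: ${networkTicker}`)
  }
  return NETWORK_IDS[networkTicker]
}
```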

if (networkTicker !== undefined) {
const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[networkTicker]]
Collaborator

Same thing here

FAILED_TO_FETCH_PRICE_FROM_API_500: (day: string, ticker: string) => { return { statusCode: 500, message: `Failed to fetch ${ticker} price for day ${day}` } },
MISSING_WS_AUTH_KEY_400: { statusCode: 400, message: 'Missing WS_AUTH_KEY environment variable' },
MISSING_PRICE_FOR_TRANSACTION_400: { statusCode: 400, message: 'Missing price for transaction.' },
INVALID_PRICES_FOR_TX_ON_CSV_CREATION_500: (pricesLenght: number) => { return {statusCode: 500, message: `Missing price for transaction in CSV creation. ${pricesLenght}` }},
Collaborator

The variable name and message are not right.

To be more precise, it is the number of prices that is wrong, not the prices themselves. But more than that, this happens if the number of prices is != 1, so there could be missing prices or there could be more prices than expected. This error says "Missing price", which is not always the case, and just throws a random number at the end without explaining what it is. A better approach here would be:

INVALID_PRICES_AMOUNT_FOR_TX_ON_CSV_CREATION_500: (pricesLength: number) => { return {
  statusCode: 500,
  message: `Wrong number of prices for transactions group in CSV creation. Expected 1, got ${pricesLength}.`
}},

@chedieck chedieck self-requested a review February 26, 2025 21:37
@chedieck chedieck left a comment

Approving since this has already been requested for some time, but I think the tests should improve in the future to also verify the correctness of splitting same-UTC-day lines into different lines across different timezones.
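The follow-up test suggested here hinges on one fact: the same UTC instant can fall on different calendar days in different timezones. A minimal, self-contained sketch (the dayKey helper is hypothetical):

```typescript
// Formats a timestamp as a YYYY-MM-DD day key in a given IANA timezone,
// so grouping by day respects the account's timezone setting.
function dayKey (timestampMs: number, timeZone: string): string {
  return new Intl.DateTimeFormat('en-CA', {
    timeZone,
    year: 'numeric',
    month: '2-digit',
    day: '2-digit'
  }).format(new Date(timestampMs))
}
```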

@Klakurka Klakurka left a comment

I'll approve, but I agree that the tests need to be better in follow-ups.

@Klakurka Klakurka merged commit bf6238b into master Feb 26, 2025
2 checks passed