Conversation

Tests failing.
Klakurka left a comment:
- Let's remove the 'Collapsed' column from the CSV.
- Let's put all of the transaction IDs into the tx id column, comma-separated (assuming that's possible).
- How about adding some tests to make sure the amounts and values add up correctly (i.e., the same amount and value totals as when not collapsed)?
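A test for that invariant could be sketched roughly like this. The `Payment` shape and `collapseSmallPayments` are hypothetical stand-ins for the real helpers, with a $1 threshold assumed:

```typescript
// Hypothetical sketch of the requested test: collapsing must preserve totals.
// Payment and collapseSmallPayments are illustrative, not the project's real API.
interface Payment { amount: number, value: number, day: string }

// Collapse same-day payments whose value is below the threshold into one row.
function collapseSmallPayments (payments: Payment[], threshold: number): Payment[] {
  const kept: Payment[] = []
  const groups: Record<string, Payment> = {}
  for (const p of payments) {
    if (p.value >= threshold) { kept.push(p); continue }
    const g = groups[p.day]
    if (g === undefined) groups[p.day] = { ...p }
    else { g.amount += p.amount; g.value += p.value }
  }
  return kept.concat(Object.values(groups))
}

const payments: Payment[] = [
  { amount: 100, value: 0.5, day: '2024-01-01' },
  { amount: 200, value: 0.3, day: '2024-01-01' },
  { amount: 5000, value: 25, day: '2024-01-01' }
]
const collapsed = collapseSmallPayments(payments, 1)
const totalValue = (ps: Payment[]): number => ps.reduce((s, p) => s + p.value, 0)
const totalAmount = (ps: Payment[]): number => ps.reduce((s, p) => s + p.amount, 0)
// The same amount and value totals as when not collapsed:
if (Math.abs(totalValue(collapsed) - totalValue(payments)) > 1e-9) throw new Error('value totals differ')
if (totalAmount(collapsed) !== totalAmount(payments)) throw new Error('amount totals differ')
```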
constants/index.ts (outdated):

    export const DEFAULT_PAYBUTTON_CSV_FILE_DELIMITER = ','
    export const MAX_RECORDS_PER_FILE = 10000
    export const DEFAULT_COLLAPSE_THRESHOLD_FILE_USD = 1
How about changing this from USD to DEFAULT_CSV_COLLAPSE_THRESHOLD? Then, in a follow-up branch, we can make it configurable in the account settings. The threshold will use whatever currency they have set, not just USD.
The CSV uses a comma to separate the columns, so separating the tx IDs with commas will mess up the file. Can we use a semicolon (;) instead?

Sure, we can go with that.
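The agreed format can be sketched like this. `buildRow` and `TX_ID_SEPARATOR` are hypothetical names; only the comma delimiter comes from DEFAULT_PAYBUTTON_CSV_FILE_DELIMITER in the PR:

```typescript
// Sketch: join tx IDs with ';' so the ',' CSV column delimiter stays unambiguous.
const CSV_DELIMITER = ','   // matches DEFAULT_PAYBUTTON_CSV_FILE_DELIMITER
const TX_ID_SEPARATOR = ';' // hypothetical constant for the tx id column

function buildRow (columns: string[], txIds: string[]): string {
  return [...columns, txIds.join(TX_ID_SEPARATOR)].join(CSV_DELIMITER)
}

const row = buildRow(['2024-01-01', '12.34'], ['abc123', 'def456'])
// row === '2024-01-01,12.34,abc123;def456'
```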
Klakurka left a comment:
- I know some may not want this, but for us, we definitely need this to be done in both the Payments and Transactions CSVs. We'll eventually make this configurable.
- Let's do the same thing we're doing with tx IDs for addresses (put them all in, but semicolon-separated).
Klakurka left a comment:
Still an issue: it looks like the problem is that there's a max length on each row (or cell).

I think I'm OK with just omitting the tx IDs if there are multiple. Businesses are unlikely to need to reference the tx ID on small payments; the important ones will be the big ones. If there's more than one tx (a collapsed row), just say 'Multiple'.
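That suggestion can be sketched as follows (`txIdColumn` is a hypothetical helper name):

```typescript
// Sketch: show the tx id only when the row holds a single tx;
// collapsed rows just say 'Multiple'.
function txIdColumn (txIds: string[]): string {
  return txIds.length === 1 ? txIds[0] : 'Multiple'
}

const single = txIdColumn(['abc123'])             // 'abc123'
const multiple = txIdColumn(['abc123', 'def456']) // 'Multiple'
```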
Force-pushed 440ce76 to bc12ff9.
Klakurka left a comment:
I went through and did a more thorough check of the transactions, and it appears that we're missing a bunch.

Here are some dates where we should have transactions but somehow don't in the CSV (for the 5 eCash Land addresses; 6,733 txs total):
- 2024-02-13
- 2024-03-06
- 2024-05-23
- 2024-05-30
- 2024-06-11
- 2024-08-23
- 2024-10-17
- 2024-10-21
- 2024-11-19
- 2025-01-09
Force-pushed bc12ff9 to 93a681b.
Klakurka left a comment:
The tx count lines up now, but the amounts and values don't. For example, 2024-10-19 should be 2068.67 XEC, not 2069 XEC (it seems to be getting rounded).
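The symptom is consistent with the amount being rounded to an integer instead of formatted with the network's decimal places. A minimal illustration, with assumed DECIMALS values mirroring the per-ticker map used elsewhere in this PR:

```typescript
// Sketch: XEC amounts carry 2 decimal places, so formatting must keep them.
const DECIMALS: Record<string, number> = { XEC: 2, BCH: 8 } // assumed values

function formatAmount (amount: number, ticker: string): string {
  return amount.toFixed(DECIMALS[ticker]) // keeps the fractional part
}

const kept = formatAmount(2068.67, 'XEC')      // '2068.67'
const rounded = Math.round(2068.67).toString() // '2069' — the behavior the comment flags
```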
Force-pushed cd9e4e4 to e3f6f34.
Force-pushed 0dfa4c7 to 933ce99.
Klakurka left a comment:
Looks good. In production, I'm going to spend some extra energy making sure everything is as expected before closing the task.
chedieck left a comment:
Requesting:
- a bug fix for 8 decimal places on the fiat value when dealing with a BCH address,
- avoiding possible float precision errors,
- naming improvements,
- other refactors.
utils/files.ts (outdated):

    const getPaybuttonTransactionsFileData = (transactions: TransactionsWithPaybuttonsAndPrices[], currency: SupportedQuotesType, timezone: string): TransactionFileData[] => {
      const paymentsFileData: TransactionFileData[] = []
      transactions.forEach(element => {
element is a bad name for what you know is a transaction. Any array has elements; this is the variable-naming equivalent of typing something as any. Just call it tx or transaction.
utils/files.ts (outdated):

    const { amount, hash, address, timestamp } = element
    const value = getTransactionValueInCurrency(element, currency)
    const date = moment.tz(timestamp * 1000, timezone)
    const rate = value / amount.toNumber()
I couldn't find any float precision error myself, but I don't feel comfortable knowing it can happen. We already have this price rate in the DB; there's no need to recalculate it from value / amount, since we literally calculated the value from this price. You can use something like:

    const price = tx.prices.find(p => p.price.quoteId === QUOTE_IDS[currency.toUpperCase()])!.price.value
utils/files.ts (outdated):

    const { timestamp, hash, address, amount } = tx
    const values = getTransactionValue(tx)
    const value = Number(values[currency])
    const rate = value / Number(amount)
Same thing I commented below about float precision errors.
utils/files.ts (outdated):

      collapseThreshold: number
    ): TransactionFileData[] => {
      const treatedPayments: TransactionFileData[] = []
      const tempGroups: Record<string, TransactionsWithPaybuttonsAndPrices[]> = {}
tempTxGroups would be clearer.
utils/files.ts (outdated):

    let totalPaymentsTreated = 0

    const pushTempGroup = (groupKey: string): void => {
      const tempGroup = tempGroups[groupKey]
utils/files.ts (outdated):

    }
    const totalAmount = tempGroup.reduce((sum, p) => sum + Number(p.amount), 0)
    const totalValue = tempGroup.reduce((sum, p) => sum + Number(getTransactionValue(p)[currency]), 0)
    const rate = totalValue / totalAmount
Same thing I commented below about float precision errors, with the difference that here we have many txs. These txs, however, all belong to the same UTC day, so they should have the same price. What you could do is:

    const uniquePrices = new Set(
      tempTxGroup.map(tx => tx.prices.find(p => p.price.quoteId === QUOTE_IDS[currency.toUpperCase()])!.price.value)
    )

and then:

    if (uniquePrices.size !== 1) {
      // Should never happen
      throw new Error(RESPONSE_MESSAGES.INVALID_PRICES_FOR_TX_ON_CSV_CREATION_500(uniquePrices.size))
    }
    const rate = [...uniquePrices][0]
utils/files.ts (outdated):

      amount: amount.toFixed(DECIMALS[networkTicker]),
      date: date.format(PRICE_API_DATE_FORMAT),
    - value: value.toFixed(2),
    + value: value.toFixed(DECIMALS[networkTicker]),
This is wrong: it will lead to USD/CAD values having 8 decimal digits if the networkId is 2 (BCH). It should be DECIMALS.FIAT.
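The difference can be illustrated quickly (the DECIMALS values here are assumed):

```typescript
// Sketch: fiat values should always use DECIMALS.FIAT, regardless of network.
const DECIMALS = { XEC: 2, BCH: 8, FIAT: 2 } as const // assumed values

const value = 12.3456
const fiat = value.toFixed(DECIMALS.FIAT) // '12.35' — correct for USD/CAD
const wrong = value.toFixed(DECIMALS.BCH) // '12.34560000' — the bug the comment flags
```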
    let networkIdArray = Object.values(NETWORK_IDS)
    if (networkTicker !== undefined) {
      const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
      const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
Why are you defaulting to eCash? If the slug is invalid, maybe we should just throw an error; we already have INVALID_NETWORK_SLUG_400.

Also, you could have:

    const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[ticker]]
Also, this same code block appears right above downloadTxsFile, in both of the only two places this function is ever called. Why not put it inside it, then?

Because the networkIdArray is used to fetch the txs, which is something we do outside the downloadTxsFile function.
      if (networkTicker !== undefined) {
    -   const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
    -   const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
    +   const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[networkTicker]]
Just noticed: why get the slug from the ID if you're going to get the ID from the slug? If the goal is to get the ID, just forget the slug, check that the ticker is valid (we have a variable with the valid network tickers), and then get the ID with NETWORK_IDS[networkTicker].
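A sketch of that suggestion, where the NETWORK_IDS values and the error are assumptions:

```typescript
// Sketch: validate the ticker and map it straight to a network id,
// with no slug round-trip. Values and the error are illustrative.
const NETWORK_IDS: Record<string, number> = { XEC: 1, BCH: 2 } // assumed values

function getNetworkIdFromTicker (ticker: string): number {
  const networkId = NETWORK_IDS[ticker]
  if (networkId === undefined) {
    throw new Error('Invalid network ticker.') // e.g. an INVALID_TICKER_400-style error
  }
  return networkId
}
```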
pages/api/payments/download/index.ts (outdated):

      if (networkTicker !== undefined) {
    -   const slug = Object.keys(NETWORK_TICKERS).find(key => NETWORK_TICKERS[key] === networkTicker)
    -   const networkId = getNetworkIdFromSlug(slug ?? NETWORK_TICKERS.ecash)
    +   const slug = NETWORK_SLUGS_FROM_IDS[NETWORK_IDS[networkTicker]]
constants/index.ts (outdated):

    FAILED_TO_FETCH_PRICE_FROM_API_500: (day: string, ticker: string) => { return { statusCode: 500, message: `Failed to fetch ${ticker} price for day ${day}` } },
    MISSING_WS_AUTH_KEY_400: { statusCode: 400, message: 'Missing WS_AUTH_KEY environment variable' },
    MISSING_PRICE_FOR_TRANSACTION_400: { statusCode: 400, message: 'Missing price for transaction.' },
    INVALID_PRICES_FOR_TX_ON_CSV_CREATION_500: (pricesLenght: number) => { return { statusCode: 500, message: `Missing price for transaction in CSV creation. ${pricesLenght}` } },
The variable name and message are not right. To be more precise, it's the number of prices that is wrong, not the prices themselves. More than that, this happens whenever the number of prices is != 1, so there could be missing prices or more prices than expected. This error says "Missing price", which may not always be the case, and it just throws a random number at the end without explaining what it is. A better approach here would be:

    INVALID_PRICES_AMOUNT_FOR_TX_ON_CSV_CREATION_500: (pricesLength: number) => { return {
      statusCode: 500,
      message: `Wrong number of prices for transaction group in CSV creation. Expected 1, got ${pricesLength}.`
    } },
chedieck left a comment:
Approving, since this has already been requested for some time, but I think the tests should improve in the future to also verify the correctness of splitting same-UTC-day lines into different lines across different timezones.
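The follow-up test chedieck describes would assert something like the following, using plain Intl.DateTimeFormat instead of the project's moment-timezone helper:

```typescript
// Sketch: two txs in the same UTC day can fall on different local days,
// so collapsing must group by the user's timezone, not by UTC.
function localDay (timestamp: number, timeZone: string): string {
  return new Intl.DateTimeFormat('en-CA', {
    timeZone, year: 'numeric', month: '2-digit', day: '2-digit'
  }).format(new Date(timestamp * 1000))
}

const early = Date.UTC(2024, 0, 1, 0, 30) / 1000 // 2024-01-01 00:30 UTC
const late = Date.UTC(2024, 0, 1, 23, 30) / 1000 // 2024-01-01 23:30 UTC
const sameUtcDay = localDay(early, 'UTC') === localDay(late, 'UTC')                          // true
const sameNyDay = localDay(early, 'America/New_York') === localDay(late, 'America/New_York') // false
```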
Klakurka left a comment:
I'll approve, but I agree the tests need to get better in follow-ups.
Related to #934

Description

Change the CSV logic to collapse payments with a value smaller than $1 USD on the same day.

Two new columns were added to the file: Collapsed and Notes. If the line contains collapsed payments, the Collapsed column will be true, and in Notes we will display "<button-name> - <payments-collapsed-count> transactions".

Test plan
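As a starting point for the test plan, the Notes column format described above can be pinned down with a small check (notesColumn is a hypothetical helper name):

```typescript
// Sketch of the Notes column format: "<button-name> - <payments-collapsed-count> transactions"
function notesColumn (buttonName: string, collapsedCount: number): string {
  return `${buttonName} - ${collapsedCount} transactions`
}

const note = notesColumn('eCash Land', 5) // 'eCash Land - 5 transactions'
```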