07-11-2024 05:45 AM
I ran into a problem when automating the export of my lookup tables to S3. I have a simple formula for the S3 file content:
[Entries].pluck('entry').to_json
What I was expecting to get from it is
{
"some_field1": 123,
"another_field": "1qwer"
}
In reality, it replaces the column names with col1, col2, col3, so the output looks like this:
{
"col1": 123,
"col2": "1qwer"
}
Is there any way to preserve the column names or map them back?
07-11-2024 08:59 AM
The quickest way to do this is to make a variable list with the key names you want and then map the LUT response into the list. It should only require one additional task.
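For example (using the field names from the original question purely as an assumption), the variable list would declare the keys some_field1 and another_field, and the lookup table entry pills would be mapped into them roughly like this (exact pill names depend on your recipe):

some_field1 <- [Entries] > entry > col1
another_field <- [Entries] > entry > col2

Calling to_json on that list variable then produces output with the desired key names instead of col1, col2.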
07-17-2024 05:13 AM
Yes, that was the solution, thanks!
07-11-2024 10:39 PM
Hi @Ralef ,
Check out this solution and see whether it helps you; it does not require any additional task either.
Explanation:
We can define the column names as per our requirement using 'format_map', and at the end we can format the result using to_json.
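For reference, a minimal sketch of that idea, assuming a two-column table and the field names from the original question (the template, column order, and quoting are assumptions to adjust to your own table):

[Entries].pluck('entry').format_map('{"some_field1": %{col1}, "another_field": "%{col2}"}')

format_map returns one formatted string per entry, so the numeric column is interpolated as-is while the string column is quoted in the template; the resulting strings can then be combined into the final S3 payload.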
Thanks and Regards,
Shivakumara Avadhani