Large data volumes when doing a large data update without a file #1030

Open
cSmithBPA opened this issue Oct 8, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@cSmithBPA

When working with large data volumes in the Update Records Without a File tool, it would be great to have the ability to 'Limit' the number of records that get run. I am trying to work with a data set of 9M records, and the browser seemingly cannot handle it.

Suggestion: add a 'Limit' section when setting up the data update.

(Screenshot attached, 2024-10-08 13:25)
cSmithBPA added the enhancement (New feature or request) label Oct 8, 2024
@paustint
Contributor

paustint commented Oct 9, 2024

@cSmithBPA - Thank you for the feedback - I think this makes sense.
It would also be nice if we could figure out a way to handle large datasets, but that gets pretty complicated because the data needs to be stored somewhere rather than kept entirely in memory.
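
To make the memory concern concrete, here is a rough sketch of processing a large result set one page at a time rather than loading everything into memory at once. It assumes a jsforce connection; the object name, field name, and chunking approach are hypothetical illustrations, not Jetstream's actual implementation.

```typescript
// Illustrative sketch only: page through query results and push updates in
// small chunks, so memory holds one page at a time instead of all 9M records.
// The object (Account) and field (Status__c) are hypothetical examples.
import { Connection } from "jsforce";

const CHUNK_SIZE = 200; // keep each update call to a small batch of records

async function updatePagedResults(conn: Connection, soql: string) {
  let page = await conn.query(soql); // first page plus a locator for the rest

  while (true) {
    // Build updates for just this page of records.
    const updates = page.records.map((r: any) => ({
      Id: r.Id,
      Status__c: "Processed", // hypothetical field change
    }));

    // Send the page to Salesforce in chunks instead of one giant payload.
    for (let i = 0; i < updates.length; i += CHUNK_SIZE) {
      await conn.sobject("Account").update(updates.slice(i, i + CHUNK_SIZE));
    }

    if (page.done || !page.nextRecordsUrl) break;
    page = await conn.queryMore(page.nextRecordsUrl); // fetch the next page
  }
}
```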

As an immediate workaround, you can always perform the bulk update from the query results page, and that will operate on the exact records you have queried, including your limits, filters, etc.
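
In other words, the query itself acts as the limit. A minimal sketch of that idea, assuming a jsforce connection and hypothetical object, field, and filter values (this is not the Jetstream UI code):

```typescript
// Illustrative only: scope the update to exactly what the query returns.
// The WHERE clause and LIMIT bound the record volume before any update runs.
import { Connection } from "jsforce";

async function updateQueriedSubset(conn: Connection) {
  // Hypothetical query: only 200 matching Contacts are ever loaded or updated.
  const soql = "SELECT Id FROM Contact WHERE MailingCountry = 'US' LIMIT 200";
  const { records } = await conn.query(soql);

  // Opted_Out__c is a hypothetical custom field used for illustration.
  const updates = records.map((r: any) => ({ Id: r.Id, Opted_Out__c: true }));
  await conn.sobject("Contact").update(updates);
}
```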

(Screenshot attached)
