Currently, we use std::getline and std::string-related functions for input parsing. However, allocating memory on each invocation of std::getline for every line of a 100K to 1M line Abaqus or VTK file is not ideal. It would likely be faster to use fgets() with a fixed-size buffer for reading files.
The primary issue is buffer/line size: the buffer must be large enough to hold the longest line. We could allocate something like 4096 bytes and see how it goes; none of the files we currently handle come anywhere near lines of that length.
gmsh/io.cpp
mesh/polytope_soup.cpp
physics/cross_section_library.cpp
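A minimal sketch of the proposed approach, assuming a 4096-byte line limit; the function name and parsing loop are illustrative only, not code from any of the files above:

```cpp
#include <cstdio>
#include <cstring>

// Assumed max line length; no file we currently handle comes close to this.
constexpr std::size_t buffer_size = 4096;

void parseFile(char const * filename)
{
  std::FILE * file = std::fopen(filename, "r");
  if (file == nullptr) {
    std::perror("fopen");
    return;
  }
  char line[buffer_size];
  // fgets() reads into the caller-provided buffer, so no allocation occurs
  // per line, unlike std::getline(std::istream &, std::string &), which may
  // grow the string's heap storage as it reads.
  while (std::fgets(line, sizeof(line), file) != nullptr) {
    // Strip the trailing newline, if present.
    line[std::strcspn(line, "\n")] = '\0';
    // ... parse tokens from `line` without constructing std::string copies ...
  }
  std::fclose(file);
}
```

One caveat worth handling: if a line ever exceeds the buffer, fgets() returns it in chunks with no newline at the end of the first chunk, so the parser should detect that case and fail loudly rather than silently splitting the line.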
Implemented for all but gmsh/io.cpp: the files containing a gmsh model are not very large, so the refactor would not have been worth the effort there. Merged to main in #156.