File is not supported as it is higher than 4GB

Hi there,

I have encountered an error message in one of my simulations and I was unable to find any previous forum posts reporting a similar issue.

The specific error message is:

*** Exception (JBinaryData::CheckFileHead) at ..\source\JBinaryData.cpp:1231

Text: The size of file is not supported as it is higher than 4GB.

File: Trommel_out/Trommel_NormalData.nbi4

This simulation imports an STL geometry, converts it to hdp, and then uses that to calculate the normal vectors needed for mDBC. The generated binary file (Trommel_NormalData.nbi4) is indeed larger than 4GB.

My question is whether there is a particular reason why binary files are restricted to less than 4GB. Is this a legacy of 32-bit operating systems, or is the restriction necessary for other reasons?
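
To illustrate what I mean by a 32-bit legacy, here is a minimal C++ sketch (purely my assumption about the cause, not the actual JBinaryData implementation) of how a 32-bit size field tops out just below 4 GiB:

    #include <cstdint>
    #include <cstdio>

    int main(){
      // If a format stores file sizes or offsets in a 32-bit unsigned field,
      // the largest representable value is 2^32 - 1 bytes, i.e. just under 4 GiB.
      const uint32_t max32=UINT32_MAX;                 // 4294967295 bytes
      const uint64_t filesize=5ull*1024*1024*1024;     // e.g. a 5 GiB normals file
      printf("32-bit limit: %u bytes (~%.2f GiB)\n",(unsigned)max32,max32/1073741824.0);
      printf("A %llu-byte file fits in 32 bits? %s\n",
             (unsigned long long)filesize,filesize<=max32?"yes":"no");
      return 0;
    }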

I'm happy to provide a copy of my simulation definition file if it will help.

Any suggestions or advice will be very much appreciated.

Cheers,

Dion

Comments

  • @jmdalonso can reply to that

  • I already asked the same question. This problem appears when you use a complex geometry in combination with mDBC. When you create the normals from the geometry, they are saved in the NormalData.nbi4 file. With a complex geometry, the NormalData.nbi4 file grows very large until it reaches a limit (apparently 4GB, although I have already had .nbi4 files bigger than that) and the simulation crashes.

    You can avoid this by using a smaller value for "distanceh", but then you basically get holes in your geometry; a rough estimate of how the file size scales is sketched below. My conclusion was that you cannot use mDBC for very complex geometries yet, but correct me if I'm wrong.
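
    A rough back-of-the-envelope estimate in C++ (my assumptions only: one normal per boundary particle, three double-precision components each, headers ignored; the real .nbi4 layout may differ):

        #include <cstdint>
        #include <cstdio>

        int main(){
          // Assumed layout: one normal per boundary particle, stored as three
          // double-precision components (24 bytes); headers are ignored.
          const double bytes_per_normal=3.0*sizeof(double);
          const uint64_t counts[]={50000000ull,100000000ull,200000000ull};
          for(uint64_t n:counts){
            const double gib=n*bytes_per_normal/1073741824.0;
            printf("%12llu boundary particles -> ~%.1f GiB (%s the 4 GiB limit)\n",
                   (unsigned long long)n,gib,gib<4.0?"under":"over");
          }
          return 0;
        }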

  • Thanks Zwulch.

    I checked and my distanceh was 3. When I changed it to 2, I snuck under the 4GB limit and the simulation ran properly.

    Of course, this isn't a solution to the underlying problem that complex geometries are likely to be problematic when using mDBC. I expect to hit the limit again very soon, so I intend to escalate this issue. I really would like to understand why there is a hard limit on the size of these files. (A quick pre-run check of the generated file size is sketched in the P.S. below.)

    Cheers,

    Dion
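
    P.S. The quick pre-run check of the generated file size mentioned above; a minimal C++17 sketch using std::filesystem, with the path taken from my own case (adjust as needed):

        #include <cstdint>
        #include <cstdio>
        #include <filesystem>
        #include <system_error>

        int main(){
          namespace fs=std::filesystem;
          // Path from my case; adjust to your own output directory.
          const fs::path f="Trommel_out/Trommel_NormalData.nbi4";
          std::error_code ec;
          const uint64_t size=fs::file_size(f,ec);
          if(ec){
            printf("Cannot read %s: %s\n",f.string().c_str(),ec.message().c_str());
            return 1;
          }
          const uint64_t limit=4ull*1024*1024*1024;  // the 4 GiB ceiling from the error
          printf("%s: %llu bytes (%s 4 GiB)\n",f.string().c_str(),
                 (unsigned long long)size,size<limit?"under":"at or over");
          return 0;
        }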

  • Guessing a bit, but I suppose the hard limit might be related to where in RAM the normals are to be kept?

    Hopefully it can be resolved, though, since it seems strange to be limited by this.

    Kind regards

  • Thank you for sharing this issue with us.

    Yes, the size of the NormalData.nbi4 file may be a problem in version 5 of the model.

    This issue is already solved in the next version of the model (v5.2), which we are preparing and which will be released in a few months. In fact, a version 5.2 BETA will be released during the 6th DualSPHysics Workshop, but only for workshop attendees. This version will fix some issues and include significant new features. All the information is available at https://dual.sphysics.org/6thworkshop/

    Best regards

  • I was recently working on a case with a twisted king block and was hoping to use the mDBC boundary condition. If I fill the boundary model directly with particles, can this be created successfully?

  • Zhuzhu, you can literally find the answer to your question in this thread :D
