Update regarding Post-Processing Tools

edited July 2019 in DualSPHysics v4.4

Dear forum

A while back I posted about a post-processing tool I was working on, written in a programming language called Julia, which has its own environment, GUI, etc. The post can be found here:


Now I just wanted to share a small update, since some people have been asking how to read .bi4 files directly, and this is now possible using this tool. You just have to be in the directory of your data files and then specify the property you want, like this:

idp_rhop = readBi4Array(PostSPH.Rhop)

This gives you the density of all particles in each time step of your simulation, which is fairly easy and straightforward. Using this tool on a normal SSD and an i5 processor, I am able to read through about 1 GB of .bi4 files in roughly 1 second ±10%, so it is very nice for working with data interactively.
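As a slightly fuller sketch of this usage (assuming PostSPH.jl is installed and the current directory holds the simulation's .bi4 output; the property names are the ones mentioned in this thread, and exact signatures may differ in the current package):

```julia
# Hypothetical usage sketch of PostSPH.jl, based on the calls shown in
# this thread; names and signatures may differ in the current package.
using PostSPH

rhop   = readBi4Array(PostSPH.Rhop)    # density of every particle, per saved step
points = readBi4Array(PostSPH.Points)  # particle positions, per saved step

# e.g. the densities at the first saved time step:
rhop_first = rhop[1]
```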

The tool itself can be found here: https://github.com/AhmedSalih3d/PostSPH.jl

I have not yet updated the readme.md to describe this new functionality, but will do so as soon as possible. If any of you use both DualSPHysics and Julia, I would love it if you would test the tool and provide feedback. It is currently developed primarily by me, with assistance from my brother.

Kind regards


  • Very interesting, we will definitely take a look


  • Thanks! If there are any questions, feel free to ask - the goal is to make it usable for the community.

  • Just sharing further progress. I have finally finished the first iteration of a function which reads the mk values from the XML file produced just after running the .bat file:

    This might seem redundant, but it gives a good overview of the available mk values and makes it much easier to process the .bi4 files, since everything needed is included there. The parameters of "Body" are currently "mkbound", "mk", "begin", "count", "property" and "refmotion", while Bodies[1] to Bodies[4] represent fixed, moving, floating and fluid bodies. I ran a benchmark on "CaseManyFloatings.xml", and the result is:

    So the processing of the XML file with regard to mk values and particle begin/count is incredibly fast.
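    For anyone curious how such mk extraction can work, here is a minimal, self-contained sketch in plain Julia using a regex over a toy XML fragment. The real case XML and the function in PostSPH.jl are more involved, and the element layout below is purely illustrative, though the attribute names match the ones listed above:

```julia
# Toy stand-in for a DualSPHysics case XML; the real file is larger
# and structured differently - only the attribute names are from the thread.
xml = """
<bodies>
  <body mkbound="10" mk="21" begin="0" count="5000"/>
  <body mkbound="11" mk="22" begin="5000" count="3000"/>
</bodies>
"""

# Collect one named tuple per body entry ("begin" is a Julia keyword,
# so the field is named begin_ here).
bodies = [(mkbound = parse(Int, m[1]), mk = parse(Int, m[2]),
           begin_ = parse(Int, m[3]), count = parse(Int, m[4]))
          for m in eachmatch(r"<body mkbound=\"(\d+)\" mk=\"(\d+)\" begin=\"(\d+)\" count=\"(\d+)\"/>", xml)]

println(bodies[1].count)  # prints 5000
```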

    Thanks for your attention.

  • A further small update: after reading data it is now possible to use the "MkArray" function as follows:

    1. First you call the function and all available Mk names are listed (for example "MovingSquare"):

    Afterwards you can pass this information to the "readBi4Body" function like so:

    This states that the array k will hold the position data (PostSPH.Points) for the moving body, i.e. the moving square.

    This opens up a lot of possibilities: in theory, live data processing is now possible, and the workflow is also much more solid, since you have a single post-processing interface to work with instead of having to go through a lot of different files.
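    Put together, the workflow described above looks roughly like this. This is a sketch based only on the function names mentioned in this thread; the exact signatures in PostSPH.jl may differ:

```julia
# Hedged sketch of the two-step workflow; call forms are illustrative.
using PostSPH

# 1. List the available Mk names, e.g. "MovingSquare"
#    (shown here with no arguments for illustration).
MkArray()

# 2. Feed one of those names to readBi4Body to get a specific
#    property for that body - here the positions of the moving square.
k = readBi4Body("MovingSquare", PostSPH.Points)
```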

    Everything is of course not perfect yet, but it is a step in the right direction :-)

  • A final update for a while: I just made initial progress on extracting and plotting data while the simulation is still running! An example can be seen here:

    Here we see "CaseMovingSquare" in progress; while it runs, the Julia script checks whether new .bi4 files are being added, extracts the relevant data and adds it to the plot. Quite neat if you ask me :-) The x-axis is time in seconds and the y-axis is the x-position of the moving square.
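    The monitoring part of this idea can be sketched in plain Julia with a simple polling loop. The actual script is not shown in the thread, so the names and structure below are illustrative:

```julia
# Minimal sketch of live monitoring: poll the output directory and
# hand any previously unseen .bi4 file to a processing callback
# (which would, e.g., read the data and update a plot).
function watch_new_files(process; dir = ".", interval = 1.0, rounds = 3)
    seen = Set{String}()
    for _ in 1:rounds              # a real watcher would loop until the run ends
        for f in readdir(dir)
            if endswith(f, ".bi4") && !(f in seen)
                push!(seen, f)
                process(joinpath(dir, f))
            end
        end
        sleep(interval)
    end
    return seen
end
```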

    Kind regards

  • Dear Asalih3d,

    Your work with Julia looks interesting and really helpful for DualSPHysics. On behalf of the DualSPHysics team, I would like to invite you to present your work at the next DualSPHysics Users Workshop, which will be held in Barcelona in March 2020.

    The workshop details (venue, dates, registration fees, abstract submission, etc.) can be found on the workshop web page. We will be pleased to see you in Barcelona.

    My regards


  • @iarba27 thank you very much for the invite. I have sent some questions privately.

    Kind regards

  • edited March 8

    @Asalih3d Late to this. In fact, I cannot seem to find the even earlier post in which you surveyed which post-processing features would be most helpful to users (very late, then), so I will take this space instead.

    In my view, the game changer would be post-processing on the GPU: clearly once you can simulate (few to several) tens of millions of particles on a GPU, post-processing on CPU is a huge leap back in time.

    This drag is most evident when you compute derived flow quantities, where the post-processor has to go through the data set once again and recompute neighbour lists and so forth. I am guessing (just guessing) that vorticity (as in https://forums.dual.sphysics.org/discussion/comment/4103) is an example of this. Whatever the example, the bottleneck of CPU post-processing is an incongruity with the GPU processing proper.

    I heard (just heard) that Julia has a module for CUDA.
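    The module in question is CUDA.jl. As a tiny hedged sketch of what GPU post-processing of particle arrays could look like (this requires an NVIDIA GPU and the CUDA.jl package, and the pressure formula here is purely illustrative, not DualSPHysics' equation of state):

```julia
using CUDA  # Julia's CUDA package; arrays below live in GPU memory

rhop = CUDA.rand(Float32, 1_000_000)  # e.g. particle densities on the GPU
p    = 10f0 .* (rhop .- 1f0)          # broadcasting runs as a single GPU kernel
```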

    Hope this helps.

  • Thanks for your comment.

    Indeed, you are right that GPU programming is increasing in use and also in ease of use, for example through Julia. Unfortunately I do not currently have the time or means to delve into this at such a large scale, but I think you are right that GPU post-processing could be a breakthrough in the future.

    It would require some rewriting of algorithms and addressing bottlenecks in transferring data around the system.

    Kind regards
