pack
Credits: vux for dx11, antokhio / schnellebuntebilder / everyoneishappy for beta testing
This pack is a collection of tools and techniques to handle a lot of particles on the GPU.
It also includes tools for dealing with depth cameras like Kinect 1 or 2.
You need:
Installation:
After installation you can see all nodes of this pack by typing dx11.particles in your node browser. There are help patches for all included nodes with small examples of how to use them.
You can fork the project here: https://github.com/letmp/dx11-particles
I really appreciate all of your help in letting this pack grow. It is quite easy to build new modifiers and selectors, so if you have something that should be part of the pack, just get in contact via robert@intolight.de or create a pull request via github.
Please report any bugs here: https://github.com/letmp/dx11-particles/issues
© Robert Willner, 2018
Author: robert.willner@gmail.com
This software is distributed under the CC Attribution-NonCommercial-ShareAlike 4.0 license.
If you want to use this pack for a commercial project, please contact robert.willner@gmail.com and tell me about your project or your goal.
thanks tmp!
can not test right now because my kinect went on a vacation to china,
but this looks really nice.
besides, this made me wonder if packs should be used to bundle existing
contributions into useful and documented/girlpowered collections for specific use cases.
Thanks for sharing tmp
looks great
Only one thing: I don't have the projector(dx11) node...
reaktant this is an interesting point and indeed a good idea. there are for example a lot of good helper modules (e.g. operandomatic, grid, ...) that would be nice to have in a pack.
noir thanks for this hint. i created the projector(dx11) node some weeks ago and it is part of the dx11-girlpower package. (see https://github.com/mrvux/dx11-vvvv-girlpower)
unfortunately the latest vvvv-dx11 zip file is not up to date so all you have to do is replace the existing girlpower folder with the one you can download on github.
Ok I'll try to download the girlpower folder from github and replace it
Thanks tmp
I was asked today if the nodes also work with more than 1 kinect. Yes it works ;)
All you have to do is to use the Cons (DX11.Texture 2d) node and merge the depth/rgb/rgbdepth output of each kinect instance.
You can then use the Validator (DX11.Layer) and GetSlice (DX11.Validator) nodes to filter the output layer of the Pointcloud node.
Confirm
Thanks again Tmp
there is a small update available on github:
https://github.com/letmp/KinectToolkitDX11/archive/master.zip
Hello,
I need some help. I want to calibrate the kinect with a projector, so I use the "calibration (kinect setup)help.v4p".
I follow the steps and everything seems to go fine. My problem is that I can't understand how I can apply the transformation to the joint positions of the "skeleton". I tried to use "ApplyTransform" and inserted the joint positions and the "transform out" from the "calibration" node, but it didn't work. Could someone help me?
Thanks in advance.
Is there a comparison between evvvvil's, everyoneishappy's and tmp's pointcloud performance?
UPDATE
changes in the latest version (08.05.15):
@cunk111: i don't know of any contributions by evvvvil or everyoneishappy related to pointclouds, so i cannot say anything about performance.
THX A LOT! this is a great tut about CS shaders as well. big ups!
I sense a disturbance in the Force.
UPDATE
NEW
FIXES
There are some features that need the latest build of dx11!
For those of you that are not able to build it: https://dl.dropboxusercontent.com/u/51232449/dx11_x86_x64_2015-08.zip
Have fun! Feedback appreciated! ;)
Guys, you really have to check out the forces part of the pack. It is already a lot of fun, and contributions of additional stuff are highly appreciated. We have the next implementations on the list (verlet + constraints, boids, connectall, sorting, etc.). But if you already have something on your hard drive, feel free to implement it in the pack and make a pull request. Let's build a(nother) powerful particle library!
Ah, and don't miss the MultiEmitterWithGroupedForces.v4p girlpower patch. Event-based behaviour switching!
hey! I used it for my installation, thanks!
continuum-interactive-video-installation-in-vvvv
Nice work, Luper. I love the wired canvas :)
Would you mind adding some attribution to the documentation of your project, to the benefit of intolight and tmp's work? It has been quite an investment of time and work to get the pointcloud pack to where it is now, hence the Creative Commons license obligation.
I can see that this wasn't a project for the big bucks but for the arts, and we also appreciate you giving away the cellular shader as an appreciation in kind, so this might not apply to you at all, but for completeness:
If you want to earn money with it, don't want to mention our name in the context of your installation or give someone parts of the code under different licensing terms, just get into contact with us.
Easy, we don't bite. We just feel it unfair if someone takes the cake we baked without us having a share.
Hallo, a couple of questions. One is in the forum, pointcloud-layer-with-projector-node: there seems to be a problem when using the Projector node with shift-y as camera.
The second one, I see IDs are disabled in Pointcloud Layer, is there any way to assign different IDs to different objects coming into Pointcloud Layer ?
These objects are defined by a buffered transform IID with Instance Noodle pack.
tx
Simone
I uploaded a solution in your forum thread.
Unfortunately I cannot reply to your second question. Can you be a bit more specific? Or even better: upload a small patch that shows your problem.
Hi, I've modified the PointCloud Layer to accept transformations and group IDs. This now works ok with normal transformations; I'll try some buffered transformations later.
Is it possible to separate position and colour of the particles? I'd like to sample colours for the same object from 2 different textures, so I could for example turn the object into particles, move them around, change the particles to the colour from the second texture, and compose the object back.
I assume I could just create 2 perfect copies of the PointCloud, one for each texture, and apply the same forces, but that would involve a double calculation of positions when all I want is 2 sets of colours.
S.
http://s000.tinyupload.com/?file_id=65015626839706849971
Hello intolight team, and first of all thank you for this amazing work.
I'm doing a new project at my university (merging pointclouds from multiple kinects and sending them to Cinema4D), and now I'm searching for the right tools.
I'm trying Processing, brekel and vvvv... so I'm wondering if it's theoretically possible to merge multiple kinects through a boygroup?
Like in your "MultipleKinectSetup" example, just on different machines...
I'm new to vvvv, so I can't estimate if this is possible.
It would be great if you could give me some thoughts!!!
Best regards
neoshaman
edit: so it's for kinectV2
hey neoshaman,
to make things short: it is possible to merge multiple kinects (connected to various machines) into one pointcloud, but not necessarily with a boygroup.
as you might know, a boygroup can only "distribute" parts of your output to multiple clients, but is not designed to "collect" data from clients at all. so boygrouping might be useful to actually configure kinects over the network, but that's about it.
to "collect" various pointclouds into a single one, you have to use a custom network patch. at intolight we successfully used ZeroMQ to that end.
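The collect pattern described here can be sketched in a few lines. This is only a hypothetical wire format for illustration (not intolight's actual ZeroMQ protocol, and the function names are made up): each kinect client serializes its local points, and a collector machine concatenates all packets into a single pointcloud.

```python
import struct

def pack_points(points):
    """Serialize a list of (x, y, z) tuples: a uint32 count header
    followed by float32 triples (hypothetical wire format)."""
    buf = struct.pack("<I", len(points))
    for p in points:
        buf += struct.pack("<3f", *p)
    return buf

def unpack_points(packet):
    """Inverse of pack_points."""
    (count,) = struct.unpack_from("<I", packet, 0)
    return [struct.unpack_from("<3f", packet, 4 + i * 12) for i in range(count)]

def merge_pointclouds(packets):
    """Collect packets from several kinect clients into one cloud."""
    merged = []
    for pkt in packets:
        merged.extend(unpack_points(pkt))
    return merged

# two clients, each sending its local pointcloud
a = pack_points([(0.0, 0.0, 1.0), (0.1, 0.2, 1.5)])
b = pack_points([(1.0, 1.0, 2.0)])
cloud = merge_pointclouds([a, b])
print(len(cloud))  # 3
```

In practice the packets would travel over a network transport (ZeroMQ PUSH/PULL is a natural fit for many senders and one collector), but the merge step stays this simple.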
Hello velcrome and much thanks for the information.
I have some additional questions for this:
Do I need this one for merging in realtime? Would it be possible to record the data separately and merge it afterwards?
What else do I need.. or could you briefly give kind of workflow?
Sorry for the beginners questions... I really don't know where to start.
Best regards
Hi tmp
I'm trying to use Writer (DX11.Pointcloud Raw) but I'm getting a red node at Writer AsyncRawWriter.v4p and I see you were able to make it work http://bit.ly/2f6rWzv
Could you please share the working file?
thanks in advance!
sorry, i can not reproduce this problem.
did you checkout the latest version on github?
what does tty say?
Hi tmp,
I just downloaded the latest github version and Writer (DX11.Pointcloud Raw) is working perfect!
Thanks a lot! :-)
I'm trying to work with the Skeleton Node for Kinect2, however I don't really understand the values delivered (PositionXYZ & JointXYZ). Shouldn't the x and y values always lie between -1 and 1 or how do I need to interpret them?
it's in world space. there was something to get the joints' pixel coordinates, but i think it was removed for some reason. anyway, you should start a new forum thread to get help
hello,
i'm not sure how to split the forcebuffer & ForceIndexBuffer to match the split of the pointcloudBuffer and keep the right forces in the right place. it's possible to filter PC ids at the visualization stage, but that's far slower!
any ideas?
thanks
Thank You, beautiful work!
Massive, cheers!
simply wow, all seems to be working, top level contrib
filterworld with kinect 2 seems to be doing nothing, emitter shows kinect image when connected directly, but shows nothing but pulsating help cube when connected to filterworld
Thank you for the nice contributions, and I reaaallly appreciate the help patches :)
Is there a way to obtain a copy of a kinect emitter's particles that can be modified, let's say to produce a continuous trail?
Actually, in the help patch a Z force is applied; it looks like it resets positions, moving all particles away without replacing them. How does it work?
Thank you in advance.
wow! this is so cool
thanks
@StiX: thanks for the hint! it is fixed on github now
@hierro:
this great contrib makes my nvidia driver crash. beta35.2. x64 on win7 x64. please see report from tty. seems to be related to my system. all the other packs including particles pointclouds work fine. do i have to delete the latter? thank you.
Stacktrace
00:02:20 ERR : Exception caused by node during update :/77/193/40
00:02:20 ERR : SlimDX.Direct3D11.Direct3D11Exception in SlimDX: DXGI_ERROR_DEVICE_REMOVED: Hardware device removed. (-2005270523)
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at SlimDX.Result.Throw[T](Object dataKey, Object dataValue)
00:02:20 ERR : Exception caused by node during update :/77/132
00:02:20 ERR : SlimDX.Direct3D11.Direct3D11Exception in SlimDX: DXGI_ERROR_DEVICE_REMOVED: Hardware device removed. (-2005270523)
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at SlimDX.Result.Throw[T](Object dataKey, Object dataValue)
00:02:20 ERR : Exception caused by node during update :/77/126/101
00:02:20 ERR : SlimDX.Direct3D11.Direct3D11Exception in SlimDX: DXGI_ERROR_DEVICE_REMOVED: Hardware device removed. (-2005270523)
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at SlimDX.Result.Throw[T](Object dataKey, Object dataValue)
00:02:20 ERR : Exception caused by node during render :/77/118/187
00:02:20 ERR : System.NullReferenceException in DX11.Extensions: Object reference not set to an instance of an object.
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at VVVV.DX11.Nodes.DX11MultiStructuredBufferRendererNode.Render(DX11RenderContext context)
00:02:20 ERR : Exception caused by node during update :/77/192/32
00:02:20 ERR : System.NullReferenceException in VVVV.DX11.Nodes: Object reference not set to an instance of an object.
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at VVVV.DX11.Nodes.Geometry.NullIndirectDrawerNode.Update(DX11RenderContext context)
00:02:20 ERR : Exception caused by node during update :/77/134/7
00:02:20 ERR : SlimDX.Direct3D11.Direct3D11Exception in SlimDX: DXGI_ERROR_DEVICE_REMOVED: Hardware device removed. (-2005270523)
Stacktrace:
00:02:20 - : Stack Trace
00:02:20 - : at SlimDX.Result.Throw[T](Object dataKey, Object dataValue)
00:02:20 ERR : Exception caused by node during render :/77/134/7
00:02:20 ERR : System.NullReferenceException in DX11.Extensions: Object reference not set to an instance of an object.
Stacktrace:
00:02:20 - : Stack Trace
can you upload an example patch that throws this error?
or do you have these errors for all help patches?
did you download the correct architecture?
if you still have problems please use this page and provide as much details as possible:
https://github.com/letmp/dx11-particles/issues
thx for your fast reply! i am pretty sure that i installed the proper architecture. girlpower does not work except asymp geometry, but without colourfade. plugins work. modules all show the same symptoms. io boxes (string) need a long time to display content. then the renderer goes black fullscreen for a second. tty goes wild. system msg: nvidia driver disabled.
basically all the dynamic buffers make trouble. can do more analysis later. will post in issues. all the best.
dunno if it helps, but i had to update my nvidia drivers.
dx11.particles requires the newest dx11, which was most likely compiled against a very recent driver
because of these strange driver errors i updated the nvidia driver. this issue indeed disappeared. nevertheless, none of the modifiers of the pack work on my machine. it takes the io boxes (string) quite some time to display content; meanwhile the buffers receive 0, which they obviously dislike. more soon at github/issues. thx!
@tmp
another thing, i think there are some problems with calibration and kinect2 as well, not sure the pick points line up with kinect pointcloud properly
thanks for the hint! will fix that asap (having no kinect2 atm)
Sorry, but I don't have these nodes: GetThreadSize and Register are red =(
I use vvvv beta 35.
Thanks! =)
I'm having trouble using custom resource semantics (with a 3d buffer)
Render semantics seem fine
Patch here https://www.dropbox.com/s/dl3wgw5axhp119u/Vortex.zip?dl=1
(look in psys Vortex - nothing is happening when addressing the 3d buffer values and applying them to the force variable)
@isdzaurov : are you using the latest dx11-pack version?
Is it possible to use a different particles shader on different selections of particles in the same system?
yes -> have a look at the filterbuffer node help patch
regarding your previous question -> I will have a look at it on monday or tuesday
Hi there. Amazing contribution. I'm coming from the world of C4D and X-Particles so just getting to grips with vvvv. So much potential! My question is, is it possible to use this pack to load in pointcloud data such as .xyz .ply or .asc. Big thanks!
@StiX: please download the latest version from github. issues with pickpoint node are fixed now
@mrboni
you are using RenderSemantic(DX11 StructuredBuffer)
so you have to use
as Custom Semantic Entry
the other way would be to use RenderSemantic(DX11.Layer 3d) and keep
Hi guys check this post, you can find a texture based emitter.
https://discourse.vvvv.org/t/dx11-dynamicbuffer-select/14781/13
@tmp - ooops, missed that. Easily done using the dynamic modifiers!
Btw - awesome contribution :)
Also - What's the best way to draw splines following a particle's path? I'm currently trying -
but it's not quite right. Also I don't know how to set the spline count to only the number of alive particles..
@mrboni did you have a look at the example in girlpower? in short you can't use indirect dispatch in noodles (currently). But you can probably still achieve a given effect even with a statically sized buffer (eg scale them out when dying).
@everyoneishappy - got you. That should work for my needs. Thanksya
hallo hallo, got some problem with Color! Any ideas?
is the technique of the right color node "set"?
what happens if you set the technique of the left color node to "set" instead of "add"?
I did, but SET, ADD, SUBTRACT and MUL all give the same result... I just updated my graphics card, just in case (GeForce 970), but nothing changed
trying the other help files, same problem:
also, I still have the "dx11-pointcloud" and looks like colors works
cannot confirm. here all techniques work as advertised
currently on a 770 gtx
@Luper you can contact me via skype or here: https://riot.im/app/#/room/#vvvv:matrix.org
@luper. got the same problems. things work in pointcloud pack, but not in dx11 particles pack. no colors, no modifiers. no errors.
could not track down the problem. i think it is related to my win7. what version do you use?
unfortunately we could not find a solution for the problem until now.
luper sent me his complete vvvv folder which worked perfectly on my pc. so I assume that the problem is somehow related to the installed driver(s) or his os.
ah btw.. luper also uses Windows 7 and has the same issue.
I tested it on Windows 8 -> everything is working.
so at the moment I can only recommend to update to win8 or higher.
ok it is definitely a problem with windows 7.
it limits the count of bound rwstructuredbuffers to 8, and particles needs 9 at the moment. perhaps I can find a solution for that, but not in the next days. until then I can only recommend you update windows to at least version 8 :)
my question is... let's say i have one particle system made from sprites. from that particle system i want to select a part with intersection, and emit from that selection but in phongpoint (so i guess into a different particle system?). is something like that even possible? i thought all the names in systems are good for that, but probably not? RWStructuredBuffer also operates inside one system, i guess.
/// AAAAh, found filterbuffer in the conversation here. guess i would suggest putting it into the selection category :3
yes, filterbuffer is the right node for that!
and yes - this node could be moved to selection category :)
kinect is heavily messing with my framerate / smoothness
without it i have around 50 fps stable movement
when i turn on kinect node alone, without plugging it into particles, everything is a bit choppy, windows+mainloop, the first parameter in perfmeter is going ham
without kinect:
with kinect:
i know this is probably a dx11 thing, still a bummer
can anyone else confirm that problem?
Hallo, I tried both vpm and github but nodes are missing, just to name a couple ScaleFade and ColorFade.
S.
are they really missing, or are they empty?
which vvvv did you use it with?
I am asking, because there are still inconsistencies with https-linking of the dtd between alphas and betas.
is this something we should be aware of? if so, please elaborate!
I would say they are missing, not available at the described path, using the latest version.
problem confirmed with a fresh install.
I guess they are now replaced by ScaleByLifetime and ColorByLifetime, but without a diffff.xml this change breaks patches (including help patches)
Yes, ScaleFade and ColorFade were renamed in latest version on github.
They are called ScaleByLifetime and ColorByLifetime now.
Additionally there are new nodes called ScaleByDistance and ColorByDistance.
Sorry for the inconvenience! The pack has version info and a diffff.xml now, so this cannot happen again ;)
How do I control the number of emitting/max. particles? In the KinectSetup example patch, the amount of emitting particles can be increased by entering a lower number than the standard 100 in the Mod node (hope that's the correct way). But with more particles, the emission stops at certain intervals. It seems that there is a restriction on the maximum number of particles somewhere. How can I adjust/increase it?
Edit: Yeah, I think I understand now: the number for the Mod node is counting the points from the Kinect texture. 100 means that particles emit from every 100th point; 1 means particles emit from every point. There is one emitter that is fed with the resolution of the Kinect depth map, which defines the emitter size. Increasing this number extends the max number of particles.
Edit 2: Well, it seems that the performance drops significantly when trying to make a denser emission by emitting from every 5th point (also, it causes a kind of regular grid). Could it be that multiple particles emit from the same point and thus things get slow? Maybe there is a way to restrict the emission by using filters on the Kinect textures? I tried with blur and masks, but it seems that the textures don't work anymore when applying filters. Any suggestions?
Hi guest,
"Could it be possible that multiple particles emitting from the same point and thus things getting slow?"
"Maybe there is a way to restrict the emission my using filters on the Kinect textures?"
Hey guys
im getting some red nodes, coming from DX11.Extensions.dll
I tried to load the DLL in the vvvv patch to see if it finds the correct path, but it only loads the filepath; it doesn't open the dialog to select which node you want to use from that DLL.
any hint?
using win 10 64, vvvv 50beta35 clean with dx11 pack and dx11 particles
Thanks tmp!
Count per particle is set to 1. Reducing the Kinect resolution seems to have the same effect as changing the value in the Mod node, causing a regular emission grid that I would actually like to randomize a bit. Btw, is there a way to set the lifespan of the particles (in the dx11.pointcloud version too)? I am thinking of reducing the lifespan depending on the number of Kinect players in order to retain smooth performance.
@manuel:
where did you get the particles pack from? latest github version? did you follow the installation instructions? did you download vvvv/dx11/dx11.particles in the same architecture? Please use TTY Renderer and paste error messages here.
@guest:
If you want to randomize it, you should use the mod-selector and play around with value + offset inputs.
There is no lifespan in dx11.pointcloud. So you should use the particles pack for this kind of behaviour. In general I recommend not to use dx11.pointcloud anymore ;)
@tmp
the manual way didn't work
now I downloaded the github version AND used .vpm, so it's working
note: when I tried the vpm link here, it wasn't frozen while loading, but the confirmation to download never appeared
Hi tmp,
adjusting the Mod "offset" doesn't change anything, even with different values on the Mod "input". I tried a workaround by masking out the Kinect textures with noise, so that random pixels are black and not emitting. This works visually, but it seems that it doesn't affect the performance or the counted particles in relation to the max number. I even tried to mask out 80% of all Kinect textures, but the performance and particle dropout are similar to having no mask at all. So it seems that invisible particles are somehow still around. I made a patch that illustrates the problem, but can't upload it here.
@tmp
This screenshot illustrates the grid emission when setting the Mod input to 39. I guess every 39th pixel of the Kinect texture is used for emission. I'm wondering if there is a way to achieve a more irregular emission. For example, emitting from every point, but rather than all together at every frame, in random intervals for each point. That would already give a more natural distribution without too much load.
you could write a custom selector that takes a dynamicbuffer with 0 / 1 values as input.
this should be an easy task if you have a look at the existing selection nodes.
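The suggested custom selector can be toy-modelled like this (a CPU sketch only; the real node would be a compute shader, and `mask_select` is an illustrative name, not an existing node):

```python
import random

def mask_select(points, mask):
    """A dynamic buffer of 0/1 values decides which points may emit:
    1 = emit, 0 = skip."""
    return [p for p, keep in zip(points, mask) if keep == 1]

points = [(i * 0.1, 0.0, 1.0) for i in range(10)]
# irregular emission: each point gets a fresh 0/1 value every frame,
# so roughly 30% of the points emit, in a different pattern each time
mask = [1 if random.random() < 0.3 else 0 for _ in points]
emitting = mask_select(points, mask)
```

Feeding a new random mask every frame gives exactly the irregular, non-grid emission asked about above, and because skipped points never enter the particle buffer it also saves the emission cost.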
@tmp
Writing a custom selector is beyond my knowledge. But the "Filter DX11.Pointcloud Texture 2d" works with the trick of using a noise texture to filter out some particles randomly. Is there an equivalent filter in the current pack?
Also, with the Pointcloud pack, a seamless emission from every Kinect texture point is possible. But in the new pack, when setting the Mod input to 1, it emits only from some parts every few seconds. A seamless emission is only possible by increasing the Mod input, which decreases the emission grid. Increasing the emitter size leads to some more particles, but at the cost of a massive framerate drop. Tests were done with the Kinect Setup patch.
@tmp: Please help with some advice on the Kinect emission rate in the new dx11.particles version. I'm totally stuck trying to emit from every Kinect pixel/point. As this worked fine in the old pointcloud version, it should be somehow possible with the current one. Many thanks!
hey guest,
would you mind opening a new thread for your problem? this seems to be quite specific, tbh. and while you are at it, please upload any patches you have so far, so people get an easier handle on what you want to achieve, what you've tried and where exactly you failed. this contrib thread is a place for general discussion about the contrib.
anyway, from my understanding, all particle clouds from the kinect emitters are supposed to be very short-lived (as in one frame only). so if you want to have longer-living "dust" from the body, you should create a secondary particle system (best with a different name) and emit from (selected and filtered, if you may) particles of the first particle system.
the reason for this is plain: as of now it is impossible with the pack to have a clean and solid id continuity between points of any depth cam. and that is the same as with the old pack, really.
hope that helps, otherwise please open a new thread
Hi velcrome,
Created a new thread here: https://discourse.vvvv.org/t/dx11-particles-kinect-emission-density/15101 with patches, if you want to have a look.
Hello and thanks for this amazing upload.
I have a problem: most of the nodes are red. I installed the package as mentioned in the "Second Way" and have the latest master build from github. Any ideas?
Picture with tty attached.
Best regards
neoshaman
do you have the addonpack installed?
Man, somehow my addonpack and my dx11 library were deleted while I was installing vpm. I had some weird errors after installation and didn't realize that some libs were missing.
Thanks!
Hello again,
your particles are avvvvesome, but somehow I still have some red nodes. I've tested almost all help files and figured out that six nodes are red:
Constant(DX11.Particles.Effect) > ConstantGeometry
DynamicModifier(DX11.Particles.Modifiers) > Modifier_Skeleton
Scale(DX11.Particles.Modifiers) > Modifier_Scale
Target(DX11.Particles.Modifiers RWStructuredBuffer) >Modifier_Target_RWStructuredBuffer
Point(DX11.Particles.Effect) > ConstantPoint
Emitter(DX11.Particles.Emitter Layer) > BufferUtils_UpdateCounterBuffer
Do you know what to do?
Cheers
Hi,
I have a problem figuring out the kinect calibration (calibration(dx11.particles.kinect)help.v4p). I'm trying to calibrate on an xy-plane, but since the PickPoints are on the xz-axis the calibration turns out wrong. I didn't find a way to rotate the PickPoints before calibration (which worked in pointcloud, if I remember correctly). Is there any way to do this?
cheers
@neoshaman : are you using dx11 latest release ?
Newest version of vvvv and dx11 installed. Still same problems.
Hmmmm.....
for all users who don't want to download the latest version via vpm or github:
I uploaded the latest version as zip to this contributions page.
changelog highlights:
complete list here:
https://github.com/letmp/dx11-particles/commits/master
thanks man, seems like there are some assets missing in girlpower
what exactly do you miss? if you are talking about the chunk example: you have to follow the instructions to generate the assets. they are not included because they would blow up the filesize of the pack.
Ok cheers, I hadn't found girl.obj from geom assimp so I thought some stuff from the chunk example was missing. All good
hello, is there a way to feed this particles data to noodles? cheers!
@graphicuserinterface
Yep
check the girlpower folder there is a proper example
updated to latest vvvv & dx11 version
Hi, wow, I just finished the video of the workshop today. It's amazing and really fast to learn. Great!
Do you think it's possible to detect the intersection of two particle systems?
I want to modify the particles of a given system with the particles of the kinect.
Question / Bugreport:
Emitter (DX11.Particles.Emitter RWStructuredBuffer) Help-Patch:
Why is the first Emitter (EmitterDynBuffer1) affected by the "Force" connected to the "Selection" on the right? It seems the selection is not working properly. I would assume that only the Emitter RWStructuredBuffer particles are affected by it.
the particles of that emitter aren't affected by the force modifier.
you can verify that by grouping the force modifier with a color modifier :)
Check. Didn't see the spread in the z-input of the vector-node. Ok, now the whole pack is perfect. No bugs anymore. Anywhere. Chapeau. :)
Has anyone had a look at the vectorfield nodes? They don't seem to apply forces in the correct direction. Try these examples - http://www.mediafire.com/file/4x174fgf8lmec6g/VF_Examples.zip
@mrboni I had this behaviour as well. as far as i remember, you have to set the renderer inside the VectorfieldReader to r16g16b16a16float. that should fix the bug.
thats right. sorry for that!
fixed in latest version on github!
FYI: I uploaded the latest version from github to the contribution page.
Changelog: https://github.com/letmp/dx11-particles/blob/master/CHANGELOG.md
small update for everyone who wants to use 3rd party shaders (like SuperPhysical)...
there is a new node called AsGeometry that outputs a single geometry. have a look at the help patch and have fun..
Thanks for the fix and update!
Thanks for the update
Awesome and thx for the update!
are the hitboxes a cpu or a gpu job?
if no visual representation is needed, are they expected to work on a mini pc with an intel gpu?
the age modifier seems to be broken right now, it doesn't do anything
@ggml
everything in this pack is a gpu job, since all calculations/modifications/selections are done with compute shaders. and yes. everything should work on intel gpu's.
@StiX
Cannot confirm. Everything working as expected here. I updated the helppatch of that node in the repository. You can download it here:
https://raw.githubusercontent.com/letmp/dx11-particles/master/packs/dx11.particles/nodes/modules/Modifiers/Age%20(DX11.Particles.Modifiers)%20help.v4p
There you should see in the bottom right that the age modifier is working.
but why does it not affect their lifespan?
the way i would expect the age and lifespan modifiers to work:
if i have a particle with a lifespan of 1 and i set the age to 0.8 on every frame, it should never die
if i have a particle with a lifespan of 1 and set the lifespan to 10, it should live 10 times longer
right now, if i set the age modifier to 0.8, particles with a lifespan of 1 still get emitted every second
and if i set the lifespan to 1, the particle never dies
ok I updated the age and lifespan modifiers. from now on age and lifespan affect each other by default. there is a hidden pin to switch back to the old behaviour.
additionally I added a kill modifier.
everything downloadable on github. have fun :)
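The fixed semantics can be sketched as a tiny CPU toy model (not the pack's actual compute-shader code; the dict fields and `step` function are illustrative): the Age/Lifespan modifiers write their values first, then age advances, and the particle dies once age reaches lifespan.

```python
def step(p, dt, set_age=None, set_lifespan=None):
    """One frame of a particle's life with Age and Lifespan
    modifiers that affect each other (the new default)."""
    if set_age is not None:
        p["age"] = set_age
    if set_lifespan is not None:
        p["lifespan"] = set_lifespan
    p["age"] += dt
    p["alive"] = p["age"] < p["lifespan"]
    return p

# clamping age back to 0.8 every frame keeps a lifespan-1 particle alive
p = {"age": 0.0, "lifespan": 1.0, "alive": True}
for _ in range(600):  # 10 seconds at 60 fps
    step(p, 1 / 60, set_age=0.8)
print(p["alive"])  # True

# without the age modifier the same particle dies after ~1 second
q = {"age": 0.0, "lifespan": 1.0, "alive": True}
for _ in range(600):
    step(q, 1 / 60)
print(q["alive"])  # False
```

This matches the behaviour StiX asked for: setting age to 0.8 every frame keeps the particle alive forever, while raising the lifespan lets it live proportionally longer.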
thanks a lot! we were really scratching our heads on this one :)
updated contribution downloads to latest version.
changelog:https://github.com/letmp/dx11-particles/blob/master/CHANGELOG.md
Do you think there's a way to get DX11.particles to do a physical fluid simulation?
At the low end something like this
https://www.youtube.com/watch?v=2WYc8zCyG-w
At the high end...
https://www.youtube.com/watch?v=RMEUuxU-Dso
I know it's not trivial but want to see if I'm theoretically barking up the right tree. Do you think a custom modifier would be capable of this kind of dynamic behaviour?
you need to look into porting https://developer.nvidia.com/flex
TouchDesigner has it, so it should be possible for v4
People have already done FleX in vvvv afaik. But it's Nvidia stuff, so you need to be a registered dev and are under legal restrictions etc.
Using Emitter(DX11.Particles.Emitter.Layer), how can I "disengage" from the positions given by the object? No modifier can affect the position of the particles as they stick to the object...
@io I know your comment was a few months ago but in case an answer is still useful.
You have to set a lifetime > 0 on the layer emitter. If you set it to 1, it will read the original positions from the layer every second.
After that any other modifier can give the particles movement, eg the SelfRepulsion modifier. This is good for emitting trails of particles or making an object explode or slide away.
If what you need is for the particles to always be stuck to the layer object, then it's a bit more complicated.
One approach is to keep lifetime at 0 so they re-emit at the new positions every frame.
Then you could use a modifier in ADD mode on their POSITION property to create apparent movement.
E.g. if you use the VF3D Noise and VF3D ParticleModifer nodes from the fieldtrip pack and then run some animation (integration) through the noise, the layer-object particles will appear to move.
In terms of a solution where you emit the particle and then it persistently sticks to the layer object while you add other behaviours: this is difficult, because there is no trivial way of targeting the new positions on the layer object. You can use the RW Buffer nodes to create and target particles based on the particles emitted by the layer emitter, BUT the particle count changes every frame and the layer emitter has no way to keep the order of the particles consistent from frame to frame. So while the particles do target positions on the object, it is effectively random which position each one will run to, so the particle object cannot keep its shape and turns into noise.
A last potential solution: if your source for the layer emitter already has velocity calculated (e.g. it's an SDF + noise generated from VF3D Noise from the fieldtrip pack), then you could use that same velocity source as a modifier.
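The "lifetime 0 + ADD-mode modifier" approach can be sketched numerically (a Python illustration only; in the patch this is a compute shader, and `noise3` below is a cheap stand-in for the fieldtrip pack's VF3D Noise):

```python
import math

def noise3(p, t):
    # cheap stand-in for a 3D noise field; the real patch would use VF3D Noise
    x, y, z = p
    return (math.sin(3.1 * x + t), math.cos(2.7 * y + t), math.sin(1.9 * z + 0.5 * t))

def reemit_with_add_modifier(layer_positions, t, strength=0.1):
    """Re-emit particles at the layer positions every frame (lifetime 0),
    then ADD a bounded noise offset to POSITION, so the object appears to
    move while keeping its overall shape."""
    return [tuple(p_i + strength * n_i for p_i, n_i in zip(p, noise3(p, t)))
            for p in layer_positions]

layer = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
frame_a = reemit_with_add_modifier(layer, t=0.0)
frame_b = reemit_with_add_modifier(layer, t=0.5)
```

Because the base positions are re-read every frame, the displacement never accumulates, which is why the object keeps its shape instead of dissolving into noise.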
Hi all, is there any way to have multiple targets for ScaleByDistance and/or ColorByDistance?
How can I modify particle-system "areas" by several vectors? I was thinking about selections, but then I can't have, for example, a smooth color transition, since select is boolean.
Thank you in advance
I just uploaded an update for Scale/ColorByDistance nodes on github. They have spreadable inputs for position and radius now.
DL: https://github.com/letmp/dx11-particles/archive/master.zip
Hi, I did same changes to FX last night, now I found it in github, thank you very much :)
Tested and implemented, just wonderful :)
About ColorByDistance: I can't get the colors working properly. Let's say I have 4 positions and 5 colors for each of them; I actually can't get it working in order to manage the colorsets separately. Did you try that? I guess the ColorBuffer should be indexed within the loop.
P.S.: In the ColorByDistance help patch the binsize pin is set to 13.
you can achieve that with the help of selections:
Does that mean it's mandatory to use selections in order to use the colors binsize?
In my test I'm not using any selection and the behaviour is, in a way, not correct: as you can see, I'm passing spreaded colors, but everything only uses the first bin.
If I use the selection it works, but I don't need any selection :)
What I'm asking is whether the colors can be managed directly from the CBD shader, as I was expecting.
Another solution I can apply is a lerp inside the shader loop to manage a fadeToBlack with another incoming float buffer; not elegant, but it works perfectly.
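The lerp/fadeToBlack idea reduces to something like this (a Python sketch only; in the actual patch this would sit inside the shader loop, and the second float buffer is represented here by a plain list):

```python
def fade_to_black(colors, fade):
    """Lerp each particle colour toward black using a per-particle factor
    from a second float buffer: 0 = untouched, 1 = fully black."""
    return [tuple(c * (1.0 - f) for c in rgb) for rgb, f in zip(colors, fade)]

faded = fade_to_black([(1.0, 0.5, 0.0), (0.2, 0.2, 0.2)], [0.0, 1.0])
# first colour is untouched, second is fully black
```

The appeal of this workaround is that the fade factor is continuous, so it avoids the boolean on/off behaviour of a selection.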
I ended up with the following code integration and it works as I need, using a fixed binSize for colors.
do you think it makes sense?
yes, it makes sense in your application, and I recommend saving it as a custom modifier.
but I think it doesn't make sense as part of the pack, since it breaks the default binsize/selection behaviour (if I get it right).
if you want, you can send me the custom modifier, so I can have a look at it. maybe it makes sense after all :) and if not, I could add it to the examples.
Hi, here are the files used: https://we.tl/t-WOV2VuMbBN
I just added those lines because I needed to use this "mechanic" without selections (I gave it a fast try; it was getting too complex to color something faded properly).
I understand what you mean with the selection logic; maybe there could be a variant using binsize in a different way, but I don't have the package vision, and that's why I asked "if it makes sense" :)
Thank you for help and support
Hi,
When I try to use the Constant node I only get a red node. I first installed manually, and after that I tried vpm too.
hi @ciziz, just checked the latest beta, latest dx11 and latest particles, everything seems fine...
it's better if you make a forum post... for now I can only suggest going to the packs folder and first checking whether the particles folder is called 'dx.particles'; if so, delete it manually and reinstall...
Hi! Is it possible to export particles to PLY format? (for further manipulation in Houdini, for example)
I have added custom attributes (in struct Particle) at my emitter, but the COMPOSITESTRUCT output of SetupShader is not updated, and my system does not work. Anything I have missed?
I can confirm a small bug.
You used tab and a lot of spaces before each of your custom attributes, which leads to a messy type/variable extraction.
There are two solutions:
1) Exchange the spaces with a tab
2) Put a Clean (String) node between ExtractStruct and Register node
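The whitespace normalisation a Clean (String) node performs can be sketched in Python (an illustration of the idea, not the node's actual implementation; the struct snippet is a made-up example):

```python
import re

def clean_whitespace(struct_source: str) -> str:
    """Collapse runs of tabs/spaces to a single space and trim each line,
    so 'type name;' pairs can be split reliably afterwards."""
    return "\n".join(re.sub(r"[ \t]+", " ", line).strip()
                     for line in struct_source.splitlines())

# mixed tabs and spaces before/inside the attribute lines, as in the reported bug
messy = "struct Particle {\n\tfloat3 \t  position;\n\tfloat  \t lifespan;\n};"
cleaned = clean_whitespace(messy)
```

After cleaning, each member line is a single space-separated `type name;` pair, which is what a simple type/variable extraction expects.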
Just as a report: if you're having issues using DX11 particles with a beta preview, create a packs folder under vvvv_beta_38.2-4240_preview_x64 (whatever version you have), instead of putting it into the new packs directory.
Hey, Geometry Buffer seems broken on beta 39. It results in things like this https://i.imgur.com/OxbC90W.png , even when exactly following the DX11 particles workshop.
I haven't set any colours and don't know where they came from.
Which is weird, because the AssimpGeometry example works.
@wwrighter
I cannot confirm. Please open a thread for this problem and post your patch.
One hint that might solve your problem: You have to place the dx11.particles pack under vvvvdir/packs/
Hi, is there a way to blend transparent particles with opaque geometry? After a long time searching about the depth buffer and soft particles, still nothing comes in handy.
@tellelasmy
it's better to ask this on the forums; you can use the advanced Blend mode with AlphaToCoverage checked