Can I use the API to update the positions of components in an assembly to get real time motion?
I want to simulate a 6 DOF robotic arm where I would be streaming the joint angles from a script. Can I do this in SolidWorks?
Can I do this in SolidWorks?
I don't know. I have no means by which to evaluate your abilities, therefore I cannot make such a judgement.
Is it possible in SolidWorks?
Probably not directly.
Since you are talking about driving the "sim" via joint angles, you would have to create the assembly with appropriate mates/constraints so that you can drive the dimensions of the angle mates.
It is possible to drive dimensions via API. However, it wouldn't be "realtime" because SW would have to solve the mates with each change of dimensions.
The only way to make it "realtime" would be to have your own time/joint angle tables and take a screenshot/image/whatever after each solve and put them together into an animation yourself.
I am more interested in knowing whether something along the lines of what I am trying to do is feasible in SolidWorks. I don't currently know the API, but if I could determine that I can accomplish my goal, then learning the API would be my next step. I know Python, C#, C++, and Java, so I feel confident in being able to learn the API.
I have been designing a robotic process lately with one or two Universal Robots, and I'm running into performance and frame rate issues with a third party simulation tool. These issues stem from the fact that UR only provides a simulator of their controller, not of the surrounding world objects involved in a process. We picked up a third party tool to simulate the surrounding world objects, and this tool doesn't seem to have a strong graphics engine. For example, a large assembly that works fine in SolidWorks will bring this tool to a crawl. Unfortunately, that's not the biggest issue, which is that any line of code to its API can take over 100 milliseconds for a response, even with C++, which does not make for a good simulation.
Fortunately, the UR tool is robust, and I can stream joint data with a python or C# script over regular time intervals. I would be satisfied with having to spend up to like 2 hours for a simulation to compile that produces individual frames that I could later combine into a 1 or 2 minute video that shows seamless motion.
So I was able to drive movement by changing the mate dimension values (See: How to get/retrieve an assembly mate using it's name? ), and it was quickly apparent that roughly 90 milliseconds were needed to rebuild the scene. Like Josh was saying, this wouldn't be 'real-time', but taking screenshots for each time step of motion could allow for a video to be compiled which shows the smooth motion.
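As a rough sanity check on the "compile a video offline" approach (my own arithmetic, assuming the ~90 ms rebuild figure above and an illustrative 30 fps target, neither of which is fixed by the thread):

```python
# Rough feasibility math for compiling an offline animation from screenshots,
# assuming ~90 ms per mate-dimension change + rebuild (figure quoted above).
REBUILD_S = 0.090   # seconds per rebuild/screenshot step (assumed)
FPS = 30            # playback frame rate of the final video (assumed)
VIDEO_S = 120       # target video length: 2 minutes

frames = FPS * VIDEO_S           # total frames to render
compile_s = frames * REBUILD_S   # wall-clock time to generate them

print(frames)           # 3600 frames
print(compile_s / 60)   # ~5.4 minutes of compute
```

Under those assumptions, even a 2-minute 30 fps video needs only about five and a half minutes of rebuild time, comfortably inside the 2-hour budget mentioned earlier.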
That being said, the presentation transforms were very fast since they didn't require a rebuild. I'm not sure how well this would work with a multi-part assembly, such as the robotic arm. I'm thinking I would probably need to calculate the transforms for each component of the arm.
John Courtney wrote: [...] I'm thinking I would probably need to calculate the transforms for each component of the arm.
Denavit-Hartenberg ring any bells?
That is what I was referring to, yes. I'm not yet sure what frame rate I'll be able to achieve with using the DH calculation approach, so in the short term, I think I'll compile an animation.
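For anyone following along, the DH calculation approach boils down to multiplying one homogeneous transform per joint. A minimal sketch in plain Python (the 2-link parameters at the bottom are made up for illustration, NOT a real UR DH table):

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform from frame i-1 to frame i."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(joint_angles, dh_params):
    """Chain the per-joint DH transforms; returns the base->tool transform."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = mat_mul(T, dh_matrix(theta, d, a, alpha))
    return T

# Illustrative 2-link planar arm (d=0, alpha=0); link lengths are invented:
params = [(0.0, 0.5, 0.0), (0.0, 0.3, 0.0)]
T = forward_kinematics([math.pi / 2, -math.pi / 2], params)
print(T[0][3], T[1][3])  # tool x, y position; roughly (0.3, 0.5) here
```

Each intermediate product in the chain is exactly the transform you would feed to one link component of the arm, which is why this pairs naturally with the per-component transform idea above.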
I believe I saw a developer controlling motion studies with an Xbox controller in real time at a conference a few years back, so that leads me to believe it can be done with the API. Looking briefly through the API help should make it easier to assess whether you'll have access to the objects you need to make this happen. It looks like we can control motors with it, so what else do you need?
I think being able to set the position of the motors, such as the motors in each joint of a robotic arm, would be great. I was playing around with presentation transforms to move the graphical element of an object using the C++/CLI API, so that seems like another option. My ultimate goal is to create a simulation where a robotic arm picks and places an object, with the motion of the robot driven by the actual robot controller (or at least an exported CSV file of joint positions over time).
Before Presentation Transform:
After Presentation Transform:
I have yet to figure out how to actually pick up the object. One option seems to be to create temporary mates between the object's origin planes and planes located on my gripper tool. Another option seems to be to calculate the position and orientation of the object for each time step during motion, but this seems calculation-intensive. In regards to doing a physical simulation, I've only seen objects get picked up by having some kind of flange on the gripper that sits beneath the part. I'm not sure if simply "squeezing" the part is enough to pick an object up.
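One way the "calculate the object pose per time step" option can be made cheap (a sketch of the standard rigid-attachment trick, not anything SolidWorks-specific): at the moment of grasp, record the object's pose relative to the gripper once; after that, the object's world pose is just the gripper's transform times that fixed offset, with no extra geometry math per step. The translation-only poses below are invented for the demo:

```python
def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    """Invert a rigid (rotation + translation) 4x4 transform: (R^T, -R^T p)."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]  # rotation transposed
    p = [T[i][3] for i in range(3)]
    inv = [Rt[i] + [-sum(Rt[i][k] * p[k] for k in range(3))] for i in range(3)]
    inv.append([0.0, 0.0, 0.0, 1.0])
    return inv

def translation(x, y, z):
    """Pure-translation transform (identity rotation), for the demo."""
    return [[1.0, 0.0, 0.0, x], [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z], [0.0, 0.0, 0.0, 1.0]]

# At the moment of grasp: record the object's pose relative to the gripper.
T_gripper = translation(1.0, 0.0, 0.5)  # gripper pose at grasp (invented)
T_object = translation(1.0, 0.0, 0.2)   # object pose at grasp (invented)
grasp_offset = mat_mul(rigid_inverse(T_gripper), T_object)

# Every later time step: the object simply rides along with the gripper.
T_gripper_next = translation(2.0, 1.0, 0.8)
T_object_next = mat_mul(T_gripper_next, grasp_offset)
print([row[3] for row in T_object_next[:3]])  # carried to roughly (2, 1, 0.5)
```

This would pair naturally with the presentation-transform approach, since the object's transform comes out in the same form as the arm components'.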
Maybe you've already moved past this, but "just squeezing" the part might work; it would likely require some tuning. SOLIDWORKS Motion handles friction with a static and a kinetic coefficient, but because relative velocities are never perfectly zero in a numerical tool, a tolerance velocity is used for the static coefficient. A graph of this behavior is shown in the SOLIDWORKS help.
Based on your later replies, it sounds like you are perhaps just using Animator KEYS and perhaps the MOVE command (perhaps MOVE WITH TRIAD) to get what you are looking for.