I'm looking for an event that fires as a node is being dragged. I thought tapdrag was what I wanted, but tapdrag only fires when the mouse moves over the node while the mouse button is not down (so the node isn't being dragged; it's basically a mouseover). I've tried several other events, but none seem to do what I need.
How can I fire an event as a node is being dragged? Specifically, I want to move another node while the first is being dragged, which I'd rather do without using a compound node.
Use the drag event.
The events are all listed and documented: http://js.cytoscape.org/#events/collection-events
You can manually call .position() on the second node whenever the drag event fires on the first node.
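For example, a minimal sketch (the node ids a and b and the fixed offset are just placeholders for illustration):

cy.on('drag', 'node#a', function (evt) {
  var a = evt.target; // the node being dragged (evt.cyTarget in Cytoscape 2.x)
  var p = a.position();
  // keep b at a fixed offset from a while a is being dragged
  cy.getElementById('b').position({ x: p.x + 100, y: p.y });
});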
To set rules for node placement, like moving two nodes in lockstep, use the automove extension:
https://github.com/cytoscape/cytoscape.js-automove
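As a rough sketch of that approach (assuming the extension has been registered, and with placeholder node ids a and b; check the README for the exact option names):

cy.automove({
  nodesMatching: cy.$('#b'), // the node(s) to move automatically
  reposition: 'drag',        // reposition them while another node is dragged
  dragWith: cy.$('#a')       // the node whose drag they should follow
});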
Related
For situations where a parent node covers most of the screen, how can left-clicking and dragging the mouse pan the viewport instead of grabbing the node?
When I set grabbable to false for the parent node and then left-click on it and move the mouse, nothing happens.
Edit: I would like to apply the equivalent of panify(), but ideally when declaring the node.
I just needed to set pannable to true when defining the node.
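For example, in the element definition (a minimal sketch; the id parent is just a placeholder):

cy.add({
  group: 'nodes',
  data: { id: 'parent' },
  grabbable: false, // the node itself cannot be dragged
  pannable: true    // dragging on the node pans the viewport instead
});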
I have two shapes at the same position (different colors), and when I click on them I want the click event to fire on both, not just the first one.
These two shapes are in the same container.
I have tried getObjectsUnderPoint() inside stage.on("mousemove"), but that call drops my FPS from 60 to about 32 even though the handler only contains a console.log, so it's not a good solution.
I tried bubbling and useCapture, but I don't think that's what I want.
I just want to fire click on every element under my mouse.
If someone has a solution, please share it.
There are a few points here:
EaselJS objects work somewhat like HTML DOM elements. If an element has a mouse handler on it, it will block objects below it in other hierarchies from receiving the event. This is just how it works. The main difference is that objects without mouse listeners are simply ignored (similar to setting pointer-events: none on an HTML element so the mouse ignores it).
Your idea with getObjectsUnderPoint is what I would have recommended. Running any hit-test logic on mousemove is going to be expensive (mousemove fires a LOT, which is why we give the ability to throttle the typical mouseover check if you enableMouseOver on the stage).
A few things to optimize it:
Ensure you are using mode=2 on your getObjectsUnderPoint, which will only check objects with mouse listeners. This greatly reduces the overhead in checking. [docs]
Instead of checking the stage in your own mousemove handler, set a flag there and do the check on tick (see the sketch after this list). That caps the checks at your Ticker FPS. You could also use an interval to check even less often. It can introduce a little visible lag, but you can probably find a nice balance.
Reduce what is checked. If your clickable elements are all in one container, you can check only that container for mouse interaction.
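A minimal sketch of the flag-on-mousemove, check-on-tick approach (stage and clickLayer are assumed names for your stage and for the container that holds the clickable shapes):

var needsCheck = false;

stage.on("stagemousemove", function () {
  needsCheck = true; // just flag it; the heavy work happens on tick
});

createjs.Ticker.on("tick", function () {
  if (!needsCheck) { return; }
  needsCheck = false;
  // convert the global mouse position into clickLayer's coordinate space
  var pt = clickLayer.globalToLocal(stage.mouseX, stage.mouseY);
  // mode 2: only consider objects that have mouse listeners (cheaper)
  var hits = clickLayer.getObjectsUnderPoint(pt.x, pt.y, 2);
  // dispatch, highlight, or otherwise handle the objects under the pointer here
});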
If your objects are the exact same, you can also just trigger the click manually on the second item.
obj1.on("click", function (event) {
  // forward the same click event to the second, identical shape
  obj2.dispatchEvent(event);
});
I hope those approaches provide you a solution. Let me know if you have follow-up questions.
Cheers,
How do I track/listen to the active dragging of a UIElement in UWP? I added the flag CanDrag="True" to enable the drag feature. Using DragStarting and DragLeave I can identify the start and end of the drag, but I want to get the X,Y coordinates of the UIElement while it is being dragged. Any recommendation on how to achieve that? I don't want to use RenderTransform for this.
I don't think it's possible in the standard drag-and-drop flow.
There are a few options, but they all feel like workarounds.
You can use manipulations and subscribe to the ManipulationDelta event. That's where RenderTransform comes in, though.
You can start a timer loop and read the visual's position each frame (the hackiest idea).
Try playing with IsHitTestVisible and subscribe to PointerMoved on some FrameworkElement behind the dragged one.
Subscribe to PointerEntered and PointerExited on the elements you're dragging over so they fire events.
I prefer the first one. It's simple and it's plain.
In wxWidgets, I would like to have two objects on a wxPanel joined by a line, and to use a mouse-down event to relocate either of the two objects (the line should automatically be redrawn to follow the objects' new positions). I have tried using wxPaintDC to draw the initial positions of the two objects (custom-created with mouse click events) and dc.DrawLine to join them with a line. How should I proceed to make those two objects movable (via mouse-down events) along with the line? Can this even be achieved?
Of course it's possible (just about everything is; it's "only" a matter of effort), but it's not completely trivial, and you may want to use OGL, which does it for you. On the flip side, OGL is very old and has been completely unmaintained for a long time, so if your needs are really simple, it's probably still better to do it yourself.
If you do, here are some hints:
For mouse handling, you first need an (efficient if possible) hit testing function.
Once you have it, it's easy to detect where a click happens and what object is under it and start moving it if it makes sense. To be more user friendly, the move shouldn't start immediately, but wait until either the mouse moves a little, or stays pressed for some longer time (this will avoid accidental moves).
When the user is dragging the mouse you will get wxEVT_MOTION events. Use wxMouseEvent::Dragging() in your handler to check whether any mouse button is pressed.
Whatever you do, you must always draw from your wxEVT_PAINT handler, i.e. if an object moves, you must not redraw it immediately, but update its position, invalidate the area taken by it before and after the move (invalidating everything is simpler, but less efficient) and draw it from your OnPaint().
You want double buffering to avoid flicker; see wxAutoBufferedPaintDC.
I am using Dojo's touch.press() and touch.over() combination for some drag action.
When debugging on desktop Firefox, the touch.press() event is a mousedown and touch.over() is a mouseover; everything works fine and the properties I need are available (pageX, etc.).
Then I remote-debug mobile Chrome, and what I get is:
touch.press (GOOD):
a TouchEvent with type="touchstart", pageX and related properties, and a touches array with one entry (one finger) which itself has the expected properties (pageX, ...).
touch.over (BAD):
an Event with type="dojotouchover", but with no pageX and no touches array.
So the question is: is this a bug, or should I do something different?
I suppose you're listening to touch.over events rather than touch.move events? As far as I understand, the synthetic event that a touch.over handler receives on touch devices does not include all the information from the original touchmove event that triggered it.
If you want to retrieve all the information from the original event, I think you should actually listen to touch.move events.
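As a rough sketch (the element id dragTarget is just a placeholder), assuming the event delivered for touch.move keeps the coordinates of the original touchmove:

require(["dojo/on", "dojo/touch", "dojo/dom"], function (on, touch, dom) {
  var node = dom.byId("dragTarget"); // hypothetical drag target
  on(node, touch.move, function (evt) {
    // on touch devices a touches array should be present;
    // on desktop this is a plain mousemove with pageX/pageY
    var p = (evt.touches && evt.touches.length) ? evt.touches[0] : evt;
    console.log(p.pageX, p.pageY);
  });
});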