STM32 External LED blink - embedded

I am trying to blink an external LED using STM32CubeIDE and Proteus:
while (1)
{
    HAL_GPIO_TogglePin(LED_GPIO_Port, LED_Pin); /* invert the LED pin state */
    HAL_Delay(100);                             /* wait 100 ms between toggles */
}
The LED doesn't blink.

I am assuming that you have configured the pin as an output correctly, with no pull-up or pull-down resistor. In that case you will need to connect the other end of the LED to ground instead of 3.3V.
With no pull-up or pull-down resistor, the pin is neither high nor low when it isn't driven; it sits in a high-impedance "Z-state". If the LED is instead tied to 3.3V, then when you toggle the pin from low to high there is no potential difference across the LED, so no current flows; and when the pin goes from high to low, the diode property of the LED (it only allows current to flow in one direction) prevents current from flowing back toward the board.

Problem solved.
First, the polarity of the LED.
Second, the Blue Pill library in Proteus doesn't support the STM32F103C8 that I selected in STM32CubeIDE; STM32F103C6 should be selected instead.

Related

PyDirectInput Mouse Movement Is Very Sensitive

I have been playing around with pyautogui before switching to pydirectinput in order to automate things in Minecraft. I'm making a mining bot and I'm running into some issues involving automated mouse movement in the game. I'm using the moveRel() function to move the player's head up and down (I have also tried move() and moveTo(); they produced the same result as moveRel()). However, even when I set the offsets to a really low amount like 1, the player's head rotates in a full range of motion. To help you visualize this, in Minecraft, picture your character staring off into the horizon. Now imagine what would happen if you suddenly jerked the mouse back. The player would face down, right? Well, every time I try moving the mouse a little bit using pydirectinput, the player always ends up facing down. What is causing the player to look down as if its camera were anchored when I use the mouse-moving function in pydirectinput?
I solved my problem. It turns out that I needed to turn on Raw Input so that the mouse input wouldn't be accelerated so much. Raw Input uses the raw mouse movement from your computer, meaning that it does not accelerate or decelerate the mouse input to match the game sensitivity. I think that's how it works. By the way, Raw Input is in the mouse control settings in Minecraft. Anyway, because of the acceleration of my mouse input, the simulated mouse movement from my pydirectinput script was too sensitive for the game, which is why the player always looked downwards no matter what numbers I put into the moveRel() function.
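For illustration, a minimal sketch of the working setup (the relative=True flag, which tells recent versions of pydirectinput to send raw relative motion instead of computing an absolute position, is an assumption about your library version; the offsets here are just example values):

import time
import pydirectinput

# With Raw Input enabled in Minecraft, small relative moves translate
# into correspondingly small camera movements.
for _ in range(20):
    pydirectinput.moveRel(0, 5, relative=True)  # nudge the view down slightly
    time.sleep(0.05)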

Control for dimming lights with programming language

I'm planning on showing my sound work at a show, and I'm wondering if it's possible to control the lights so that they get darker and brighter slowly.
It would start from pitch black at the beginning, get brighter and darker over the course of the piece, and turn back to pitch black at the end of the sound.
I have no experience with this.
If you're using the Hue API to change the state of a light or of a group of lights (the links require you to create a free Hue developer account to access), you can set a transitiontime property. This will cause the light to smoothly transition from its current state to the chosen state over that time period. This way you'd only need to send commands to the lights when you want them to start a new transition.
Note however that you will have trouble doing a transition from complete darkness: the lowest brightness for Hue bulbs is nowhere near pitch black, so you'd notice the jump from "off" to "brightness 1".
There is also a second Entertainment API that supports streaming light changes (up to about 10 times a second) rather than relying on transitions. This is somewhat more involved, though.
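As a rough illustration, a slow fade using transitiontime might look like this (a minimal Python sketch; the bridge address, username, and light id are placeholders for values from your own bridge):

import time
import requests

BRIDGE = "http://<bridge-ip>/api/<username>"  # placeholders from registering with your bridge

def fade(light_id, brightness, seconds):
    # transitiontime is expressed in 100 ms steps, so 10 units per second
    body = {"on": True, "bri": brightness, "transitiontime": seconds * 10}
    requests.put(f"{BRIDGE}/lights/{light_id}/state", json=body)

fade(1, 254, 30)   # ramp up to full brightness over 30 seconds
time.sleep(30)     # let the first transition finish before starting the next
fade(1, 1, 30)     # then fade back down to minimum brightness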

LCD panel interfacing with BeagleBone Black

I am trying to interface a cheap LCD panel to a BeagleBone Black.
Basically, I am making my own LCD7 cape, but without the EEPROM & I2C stuff.
So far I have successfully written a device tree overlay, loaded it, and fried an LCD panel... well, without any smoke.
The problem is, after checking the LCD7 made by CircuitCo I noticed this IC between the Beagle and the LCD:
74AVC32T245
I don't really understand why it's there.
Here is the open-source design of the LCD7 cape; the transceiver is on page 21:
http://www.openhacks.com/uploadsproductos/beaglebone-lcd7-reva2-srm.pdf
Any help regarding how to interface LCD panels is very appreciated.
On page 20 of that document, section 5.2.2 (Non-Inverting Bus Transceiver) explains everything. The chip is meant for voltage level translation, in case the LCD and the MCU operate at different levels. In the BeagleBone LCD7 cape no translation is required, so it's just a buffer, and I don't think it should matter in the code implementation. The document does say "its two power rails are both 3.3V", so you should observe that.

Unity3d external camera frame rate

I am working on a live augmented reality application. So far I have worked on many AR applications for mobile devices.
Now I have to get the video signal from a Panasonic P2. The camera is a European version. I capture the signal with an AJA Io HD box, which is connected by FireWire to a Mac Pro. So far everything works great - just not in Unity.
When I start the preview in Unity, the frame buffer of the AJA ControlPanel jumps to a frame rate of 59.94 fps, I guess because of a preference in Unity. Because of the European version of the camera I cannot switch to 59.94 fps or 29.97 fps.
I checked all the settings in Unity, but couldn't find anything...
Is there any possibility to change the frame-rate unity captures from an external camera?
If you're polling the camera from Unity's Update() function, then you will be under the influence of VSync, which limits frame processing to 60 FPS.
You can switch off VSync by going to Edit > Project Settings > Quality and setting the VSync Count option to "Don't Sync".

Extending Functionality of Magic Mouse: Do I Need a kext?

I recently purchased a Magic Mouse. It is fantastic and full of potential. Unfortunately, it is seriously hindered by the software support. I want to fix that. I have done quite a lot of research and these are my findings regarding the event chain thus far:
The Magic Mouse sends full multitouch events to the system.
Multitouch events are processed in the MultitouchSupport.framework (Carbon)
The events are interpreted in the framework and sent up to the system as normal events
When you scroll with one finger it sends actual scroll wheel events.
When you swipe with two fingers it sends a swipe event.
No NSTouch events are sent up to the system. You cannot use the NSTouch API to interact with the mouse.
After I discovered all of the above, I disassembled the MultitouchSupport.framework binary and, with some googling, figured out how to insert a callback of my own into the chain so I would receive the raw touch event data. If you enumerate the list of devices, you can attach to each device (trackpad and mouse). This finding would enable us to create a framework for using multitouch on the mouse, but only in a single application. See my post here: Raw Multitouch Tracking.
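For a feel of what that callback registration looks like, here is a hedged Python sketch via ctypes (the MTDevice* names come from reverse engineering of the private framework and are not a stable API; the callback signature is an assumption based on published write-ups):

import ctypes
import time
from ctypes import CFUNCTYPE, c_int, c_void_p, c_double

# Private, reverse-engineered framework: names and signatures may change
# between OS releases.
MTS = ctypes.CDLL("/System/Library/PrivateFrameworks/MultitouchSupport.framework/MultitouchSupport")
MTS.MTDeviceCreateDefault.restype = c_void_p

# callback(device, fingerData, nFingers, timestamp, frame)
MTCallback = CFUNCTYPE(c_int, c_void_p, c_void_p, c_int, c_double, c_int)
MTS.MTRegisterContactFrameCallback.argtypes = [c_void_p, MTCallback]
MTS.MTDeviceStart.argtypes = [c_void_p, c_int]

@MTCallback
def on_frame(device, finger_data, n_fingers, timestamp, frame):
    print(f"{n_fingers} finger(s) at {timestamp:.3f}")
    return 0

dev = MTS.MTDeviceCreateDefault()   # default device; MTDeviceCreateList enumerates all of them
MTS.MTRegisterContactFrameCallback(dev, on_frame)
MTS.MTDeviceStart(dev, 0)
time.sleep(10)                      # keep the process alive while frames arrive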
I want to add new functionality to the mouse across the entire system, not just a single app.
In an attempt to do so, I figured out how to use Event Taps to see if the lowest level event tap would allow me to get the raw data, interpret it, and send up my own events in its place. Unfortunately, this is not the case. The event tap, even at the HID level, is still a step above where the input is being interpreted in MultitouchSupport.framework.
See my event tap attempt here: Event Tap - Attempt Raw Multitouch.
An interesting side note: when a multitouch event is received, such as a swipe, the default case is hit and prints out an event number of 29. The header shows 28 as being the max.
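For reference, a listen-only HID-level tap along the lines described might look like this (a sketch using the pyobjc Quartz bindings rather than the original C; it only logs type numbers, which is where an undocumented type like 29 would show up):

import Quartz

def tap_callback(proxy, event_type, event, refcon):
    print("event type:", event_type)  # gesture events surface as undocumented type numbers
    return event                      # pass every event through unmodified

tap = Quartz.CGEventTapCreate(
    Quartz.kCGHIDEventTap,              # the lowest tap level reachable from userland
    Quartz.kCGHeadInsertEventTap,
    Quartz.kCGEventTapOptionListenOnly,
    Quartz.kCGEventMaskForAllEvents,
    tap_callback,
    None,
)
source = Quartz.CFMachPortCreateRunLoopSource(None, tap, 0)
Quartz.CFRunLoopAddSource(Quartz.CFRunLoopGetCurrent(), source, Quartz.kCFRunLoopCommonModes)
Quartz.CGEventTapEnable(tap, True)
Quartz.CFRunLoopRun()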
On to my question, now that you have all the information and have seen what I have tried: what would be the best approach to extending the functionality of the Magic Mouse? I know I need to insert something at a low enough level to get the input before it is processed and predefined events are dispatched. So, to boil it down to single-sentence questions:
Is there some way to override the default callbacks used in MultitouchSupport.framework?
Do I need to write a kext and handle all the incoming data myself?
Is it possible to write a kext that sits on top of the kext that is handling the input now, and filters it after that kext has done all the hard work?
My first goal is to be able to dispatch a middle button click event if there are two fingers on the device when you click. Obviously there is far, far more that could be done, but this seems like a good thing to shoot for, for now.
Thanks in advance!
-Sastira
How does what is happening in MultitouchSupport.framework differ between the Magic Mouse and a glass trackpad? If it is based on IOKit device properties, I suspect you will need a KEXT that emulates a trackpad but actually communicates with the mouse. Apple have some documentation on Darwin kernel programming and kernel extensions specifically:
About Kernel Extensions
Introduction to I/O Kit Device Driver Design Guidelines
Kernel Programming Guide
(Personally, I'd love something that enabled pinch magnification and more swipe/button gestures; as it is, the Magic Mouse is a functional downgrade from the Mighty Mouse's four buttons and [albeit ever-clogging] 2D scroll wheel. Update: last year I wrote Sesamouse to do just that, and it does NOT need a kext (just a week or two staring at hex dumps :-) See my other answer for the deets and source code.)
Sorry I forgot to update this answer, but I ended up figuring out how to inject multitouch and gesture events into the system from userland via Quartz Event Services. I'm not sure how well it survived the Lion update, but you can check out the underlying source code at https://github.com/calftrail/Touch
It requires two hacks: using the private Multitouch framework to get the device input, and injecting undocumented CGEvent structures into Quartz Event Services. It was incredibly fun to figure out how to pull it off, but these days I recommend just buying a Magic Trackpad :-P
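The injection half of that approach, posting a synthetic middle-button click into Quartz Event Services, might look roughly like this (a Python sketch via the pyobjc bindings, not the actual Touch source):

import Quartz

def post_middle_click(x, y):
    # Synthesize an "other" (middle) button press and release at (x, y)
    for event_type in (Quartz.kCGEventOtherMouseDown, Quartz.kCGEventOtherMouseUp):
        event = Quartz.CGEventCreateMouseEvent(
            None, event_type, (x, y), Quartz.kCGMouseButtonCenter)
        Quartz.CGEventPost(Quartz.kCGHIDEventTap, event)

post_middle_click(500, 400)  # example coordinates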
I've implemented a proof of concept of a userspace customizable multi-touch event wrapper.
You can read about it here: http://aladino.dmi.unict.it/?a=multitouch (see it in the Wayback Machine)
--
all the best
If you get to that point, you may want to consider making the middle click three fingers on the mouse instead of two. I've thought about this middle-click issue with the Magic Mouse, and I notice that I often leave my 2nd finger on the mouse even though I am only pressing for a left click. So a "2 finger" click might be mistaken for a single left click, and it would also require more effort from the user in always having to keep the 2nd finger off the mouse. Therefore, if it's possible to detect, three fingers would cause less confusion and headaches. I wonder where the first "middle button click" solution will come from, as I am anxious for my middle-click Exposé feature to return :) Best of luck.