Multiple Services and Main_Thread communicating Kotlin

Converting code from iOS Swift to Android Kotlin has not been fun for this old dinosaur mainframe Fortran guy.
I've learned about Services (foreground and others), coroutines and channels, Broadcast Receivers, runBlocking, etc. I've got samples running for many different techniques and designs. I've learned a lot and have merged a lot of it into my app.
I'm running into the most difficulty getting the pieces talking to each other, and even in debugging what I'm trying.
I've learned so much on this site by searching specific questions. I've learned things not to do on this subject, like not running more than one foreground service, avoiding raw threads, etc. So I thought this time I would start high level rather than with specific how-tos.
As the title states, I'm asking for help with the best way ... what do you have to offer to allow me to accomplish the following in Kotlin:
“Multiple Services and Main_Thread communicating”
Tips, cautions, and understanding that I'm an old dog learning new tricks will be greatly appreciated.
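Since the question already mentions coroutines and channels, here is a minimal sketch, assuming a standard Android project with kotlinx-coroutines on the classpath, of one common way to let multiple services talk to the main thread: each service publishes events into a shared flow and the UI collects them. The names ServiceEvent, EventBus and LocationService are illustrative only, not from the original post.

// Minimal sketch (illustrative, not from the original thread): services publish
// events into a process-wide SharedFlow, and the main thread collects them.
import android.app.Service
import android.content.Intent
import android.os.IBinder
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.SupervisorJob
import kotlinx.coroutines.cancel
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.MutableSharedFlow
import kotlinx.coroutines.flow.asSharedFlow
import kotlinx.coroutines.launch

// Events a service can emit; extend with whatever payloads the app needs.
sealed interface ServiceEvent {
    data class Status(val source: String, val message: String) : ServiceEvent
}

// Process-wide bus: any service emits, the UI collects on the main thread.
object EventBus {
    private val _events = MutableSharedFlow<ServiceEvent>(extraBufferCapacity = 64)
    val events = _events.asSharedFlow()
    suspend fun publish(event: ServiceEvent) = _events.emit(event)
}

// One of the services; a second service would publish to the same bus.
class LocationService : Service() {
    private val scope = CoroutineScope(SupervisorJob() + Dispatchers.Default)

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        scope.launch {
            while (true) {
                EventBus.publish(ServiceEvent.Status("location", "tick"))
                delay(5_000) // stand-in for real work
            }
        }
        return START_STICKY
    }

    override fun onDestroy() {
        scope.cancel() // stop the publishing coroutine when the service dies
        super.onDestroy()
    }

    override fun onBind(intent: Intent?): IBinder? = null
}

On the main-thread side, an Activity or ViewModel would collect the same flow, for example lifecycleScope.launch { EventBus.events.collect { /* update UI */ } }. A second service would simply call EventBus.publish() with its own events; the services themselves still need to be declared in the manifest and started as usual.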

Related

Is it possible to develop with MassTransit in VB.NET?

I'm a casual developer and typically use VB.Net when writing solutions, as it's what I'm most familiar with. Though I aim to learn other languages in the future, such as C#, that's very much on the back-burner, and so here we are.
One of my current projects would benefit greatly from a service-bus architecture, so I have been researching potential options. Some of the more obvious choices, such as Azure Service Bus, are out of the question, as my application will not have access to the public internet once operational. Looking into options that could be run within a local site, I've stumbled across both NServiceBus and MassTransit as front runners.
My current stumbling block is that all the documentation and sample code seems to be written for C#. My question is whether the frameworks require programming to take place in C# exclusively, or whether I can utilise another .NET language (such as VB) instead. If this is indeed possible, I wonder if anyone could point me in the direction of some sample code? Of course I can gradually reverse engineer the C# if necessary, once I know whether VB development is even possible (or practical).
Thanks,
Chris.

iPad programming guidance

I'm just at the startup level in iPad/iPhone programming. There is a project in my mind, but I need some guidance on key points:
Fundamental requirements:
1) user interface and interaction like the Wired magazine app (playing movies on a page, etc.)
2) accessing the time a user spent on pages and videos
More:
- accessing another application's data (and let's say that application can give permission, if such a thing exists)
Maybe these are just easy things to figure out, but if you could point me to where to look, I would be pleased.
PS: I have more than 10 top-selling ebooks on iPad/iOS/iPhone programming and I've started reading them. To be clear, the names of these concepts (for example, how I should research tracking the time a user spends on a page; I tried Google in my own words but could not get the desired result) or some material covering these issues would really ease my way.
The UI portion of your question should be pretty easy to get to; it will just take learning a lot of the Cocoa Touch library, and probably even a bit of Core Foundation.
There are two books I highly recommend:
Programming iOS 4
iPhone Programming: Big Nerd Ranch
I strongly recommend Programming iOS 4, primarily because it has been updated for Xcode 4.
To answer your other question: to the best of my knowledge, you are not allowed to access other applications' data due to sandboxing. You can, however, share data between your own apps if they share the same App ID.
First, as to the question on sharing data between apps. You can pass data between apps - basically launching one app from another and passing arguments. This can go both ways. If this is what you want, I'll share more on that.
As to learning, I recommend devouring Apple's documents, their samples, and Stackoverflow. Most of the iPhone development books you come across on development will be useful.

How much time do I need to learn LabVIEW [closed]

I know that this question is too abstract. But: how much time do I need to learn LabVIEW to become an average LabVIEW developer? For example, if I buy a good book about LabVIEW and have 8 hours per day (at my work) dedicated to LabVIEW learning, how many days will I spend learning it? Could you please provide an example from your own experience. More information about me that may be helpful: I'm a developer and know C/C++/Python and a little bit of Java.
Like Swinders said, it might depend a lot on your sensibilities. I have seen people who had a really hard time migrating to the data flow concept. It's a different paradigm from the classic text-based languages and some people can't easily think in these concepts.
If you get past that hurdle, you'll find that the IDE handles a lot of the annoying things you used to have to take care of yourself (things such as syntax and memory allocation). This allows you to become productive very quickly.
It doesn't mean, however, that your level would be high. One potential pit you should try hard to avoid is casting your existing experience onto LV. The most common example is probably local variables. This may be shocking to people coming from a text-based world, but LV does not have variables, per-se. Unfortunately, it does have elements called variables and people migrating from C who find them jump on them and use them as they would use variables in C, leading to LV code which looks like C code and is bad code (at least in LV).
If you do manage to work around this, I would guess you would become better than the global average in less than a month and better than most professional developers after creating three projects you would later look at and say "what the hell was I thinking?".
I never took any of the NI courses (although I understand some of the advanced architecture ones are pretty good), but I would suggest you also spend some time in some of the online communities (such as LAVA or the NI forums) and look at some of the examples and discussions there. There's a lot of material about best practices, design patterns, etc., which would allow you to become a more professional developer.
Above all, do not abandon your current professional conduct. If you have a structured process for designing and developing software, you already have a leg up on the majority of LV programmers. Just make sure you adapt and keep using such a process.
I started with no commercial programming experience (I have always programmed for fun) and followed an on-line tutorial to pick up the basics of LabVIEW. Within a week I was able to understand existing code and could develop a small application.
It is hard to give an estimate on how long it would take to become an 'average' LabVIEW developer as this depends on what you mean by 'average'. One thing to consider is how easy you are able to think in terms of data flow rather than procedural languages. If you can pick up new programming languages quickly then this will help.
Would you be the only person using LabVIEW or are there others at your place of work that could mentor you? You may also find that there are user groups operating near you which I would recommend (check the NI website or contact your local NI office).
There is then the experience that you will need to gain to allow you to produce good LabVIEW code. I was lucky to be able to attend the National Instruments training courses a few years ago which I think helped me but only by using it have I become an 'average' LabVIEW developer.
I'd say a few weeks at most, devoting the majority of your work time to it. I had a similar background to yours when I started to develop in LabVIEW. The hardest part was adapting to the lack of variables. There are local variables, but they're not what you're used to at all. Additionally, its functions, called Virtual Instruments (VIs), can have multiple inputs and outputs, similar to how Python can handle n-tuples.
I will warn you, its array-handling features are terrible. A lot of general concepts you might be used to are difficult to implement. My mantra when working with the language is that it makes hard things easy and easy things hard. There are also a lot of "gotchas" in the language set, especially with the DAQmx functions. I'm not sure what you're planning on developing, and the Real-Time module has its own issues as well, different from those of the main language set.
I would definitely spend some time on NI's website and read as many whitepapers as you can, especially about good design practices. Learn the State Machine and Producer/Consumer patterns well; they're the backbone of many applications you'll be writing.
Good luck, it will make your head spin for a while.
There are some excellent resources to help you get started. If your employer can afford training, you can get started pretty quickly by taking a week of training run by National Instruments. The NI website also has an outstanding developer community that is highly responsive to questions even from novice developers. But I would say that the key to being comfortable with the idioms and style of the language is just plain old practice that you get by solving problems using LabVIEW on a regular basis.
You will find eventually that there is the question of hardware and instruments. LabVIEW is really all about data acquisition, either through NI's DAQ hardware, through traditional GPIB instruments, or through third-party APIs (ActiveX, .NET assemblies). If you're using LabVIEW, you're probably interfacing to hardware of some type. This can get really challenging with complex instruments and measurements. If you're getting started, I would recommend making sure that you have unlimited access to at least some of the hardware you'll be working with. In other words, make sure that your manager understands that you need a lot of access to the hardware in order to get good at developing with it.
We are using LabVIEW to create test software for our factory test systems. Over the past years I have trained some beginners to understand LabVIEW. I would say it depends on how good you are at learning new concepts. I have trained some who were able to produce standalone applications using the queued message handler concept, building dynamic GUIs and using hardware drivers, within about 3 weeks. Unfortunately there were others as well who were only able to learn half of that within half a year.
The most important thing in my opinion is the learning source. Having an experienced LabVIEW user who can guide you is the best option. If no one is available, I would recommend YouTube tutorials combined with the shipped LabVIEW examples.
The LabVIEW Core tutorials are not very handy in my opinion. They are quite boring and far from what you really need to get started.

Do you think you need some simple tutorials on Microcontroller programming?

This is not 100% programming related. But I think this is somewhat useful because it is addressing a minority in the SO community.
Microcontroller programming is one of the interesting areas in programming. I saw some topics here requesting resources for starting / learning / discussing PICs.
Example topic
Since I have plenty of knowledge and experience in this area, I am thinking of publishing some resources that help a novice learn from the basics. It will not be just a theoretical publication; it will be based on example projects. I hope to start this on a new blog + forum so that users can dynamically interact with each other. I came to this decision because I found very few sites where a novice can start learning and work collaboratively.
What do you guys think about this? Have you ever experienced such difficulty? Do you think you could get some use out of it? What are the things you would like to see on the site?
I would be thankful if you are not going to close this as NPR. I just want to do some service to other microcontroller lovers :)
There are already a few such tutorials on the net (e.g. this one from SparkFun), another one might be a valuable addition, but only if it is better or different in some way.
What will you offer that is a real improvement?
Some suggestions:
Don't assume I have windows
Have some side discussion of the differences between various MCUs and/or supporting electronics. Discuss some of the trade-offs.
You'll need a pretty general tutorial to suck people in, but the real value added might be in a specialized focus after the start
Build up to something useful and/or geeky cool
A unit on component integration (i.e. I can buy a Polar style heart rate receiver, and a MCU and a USB interface. How do I get them talking to each other so I can build an exercise data logger?)
Whatever you do, I'm looking forward to it (I'm just learning embedded stuff in my spare time...).
There are the excellent tutorials at www.mikrocontroller.net, but they are in German.
If you could create something similar for an English speaking community, that would be great.
Yes! The more resources out there for helping with embedded software (microcontroller programming) the better.
It can be quite daunting to start with, especially if you've only written software for PCs or similar in the past. There are a lot more constraints (e.g. on RAM and code space), and a whole load of things you need to know that don't apply to non-embedded software.
As others have mentioned here, there are a number of websites that cover different aspects of this; some others are OnARM, for ARM processors, the related STM32 Circle, and Jack Ganssle's articles on his website and on Embedded.com.
Though embedded systems are an enormous market (just think how many such devices there are in your house, or in your car), my impression is that there is a lot less coverage of the subject on the web - and on Stack Overflow - than for non-embedded.
So, I look forward to seeing the fruits of your labour!
Something else that's worth taking into account when targeting beginners is to directly provide pointers to useful resources, such as suitable simulators/emulators, or even addresses/webpages where you can easily order a starter kit or free samples of some chips.
For example, most semiconductor manufacturers provide free samples of their products, e.g. see microchip.com or atmel.com.
Ideally, an introductory course would be based on working with such a hardware simulator or emulator in the beginning, so that the project and all relevant experience map directly onto a real device once the beginner is interested in moving their work onto a real chip. Providing pointers to freely available resources or very affordable starter kits can also be very useful.
This would ensure that beginners can get started as easily and cheaply as possible.
Maybe for the different ARM7 and Cortex-M3 chips...?
Here everyone assumes there is a lot of information, but it is spread all over the net without any common thread whatsoever...
But if you take AVR there is quite a lot of stuff over at http://www.avrfreaks.net, and I guess that PIC has quite a lot as well.
I have written many such examples myself, but they are scattered, not organised, and probably rarely read (one time the folks at avrfreaks borrowed something). Stack Overflow might curb this, but SO could in theory be used: ask a question about boot code for an ARM whatsit, then answer your own question with example code and text on how and why it works. The SO tags would be nice in that you could do a search on "boot" "arm" "embedded", and then one on "boot" "avr" "embedded", etc., and get similar example programs for different platforms.
Personally I would go more in the direction of creating an example archive of complete programs for specific microcontroller versions (in typical uses), instead of making yet another "general" tutorial. E.g. one for microcontroller x/y that enables a serial port, one that configures a few digital outputs (setting TRIS and friends), one showing how to set up common frequency/oscillator options, etc.
When I started with PIC (a very short stint on PIC16, then PIC18, then 24F, and now dsPIC), one of the main problems was that all the examples were either only fragments or described very general principles.
A tutorial is no good if it takes more skill to get the examples actually working than the tutorial teaches.
I usually couldn't find one single complete program for exactly my controller, or even for the slightly wider group (parts that only vary in number of pins and memory/flash).
The initial program was always the problem, but sometimes later I had the same problem (initializing a certain peripheral, e.g. the encoder) all over again. It is especially frustrating if it is the first run of a new microcontroller line and you are not 100% sure of your hardware.
Unfortunately that takes some coordination, from a forum, a user group or so, since nobody has all the devices and all the variants to wire them up (e.g. different oscillator options).

Understanding WCF

Could anyone point me to a resource that explains WCF with pictures and simple code snippets. I am tired of googling and finding the same "ABC" articles in all search results.
WCF is a very complex technology that, in my opinion, is very poorly documented. It is incredibly easy to get up and running with, but the performance tuning required to run a large-scale app can be incredibly complicated and involves a lot of trial and error. One day everything is working fine, and then you find out that only a single channel is kept waiting for a new connection, and that there is a config setting you need to adjust on a custom binding to allow more channels to be waiting, so that calls don't fail in between when one channel is used and the next channel is spun up.
In general Nicholas Allen's blog is a gold mine of information. However Windbg has been my best friend in trying to explain some very bizarre behavior coming from WCF.
Here's a really simple example. It's specific to CE/Mobile devices, but the concept is no different PC to PC.
I found the following two books to be really good for getting up to speed on WCF:
Programming WCF Services (Lowy - O'Reilly)
Pro WCF (Peiris, Mulder - Apress)
They both start with more of a conceptual description of WCF, so you understand the concepts and terms. This is really useful, because it allows you to narrow any google searches to more specific concepts.
And this is an article that breaks down understanding WCF and why it was developed in a simple, bulleted list.