How can I prove that my application is safe? [closed] - malware

I wanted to ask you, experienced programmers, a question that has been bothering me recently. I'm a second-year student at a university of technology, where we spend a lot of time learning how to code. I've found that creating small but practical applications is the best way to learn, and sometimes I would like to give them away for free. And here the problem appears: if someone wants to use an app but is afraid that it's not safe, I don't know any way to prove that it's harmless other than showing the source. That's not a big deal for me, since these apps are not that big or complex, but I'm wondering if there is any way to show that a program is fully safe without sending the source code. It's basic stuff I guess, sorry if it sounds stupid and obvious.
Thanks.
Edit:
By "safe" I mean it's not a keylogger or anything like that.

Even if you strive hard to keep your app safe, if the underlying OS is vulnerable, your effort is in vain. So if you expect that kind of trust, you should probably restrict your app to platforms that you believe to be trustworthy.
As for keyloggers: show only a virtual keyboard belonging to your own app, and don't use the system default. Encrypt everything (all data) you send from your app. Create a checksum value for your app, and when someone tampers with it, make sure your app recognises that and makes itself unusable until reinstalled. Have a pre-installer that validates the platform your app is being installed on.
Never allow external sources to access the app's content. Secure your critical content in an encrypted container.
Maybe the link below provides some more insight:
http://www1.good.com/good-dynamics-platform/
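As a rough illustration of the checksum idea mentioned above, here is a minimal sketch in Python (the hash value, file name and the idea of publishing the expected hash alongside the download are assumptions, not a complete tamper-proofing scheme):

```python
import hashlib
import sys

# Hypothetical: the SHA-256 of the shipped application file, computed at
# build time and published somewhere users can verify it independently.
EXPECTED_SHA256 = "0123abcd..."  # placeholder value

def file_sha256(path):
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_integrity(path):
    """Refuse to run if the file no longer matches the published checksum."""
    if file_sha256(path) != EXPECTED_SHA256:
        print("Application files have been modified; please reinstall.")
        sys.exit(1)
```

A determined attacker can of course patch the check itself, so this only raises the bar; it does not by itself prove to anyone that the program is safe.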

Basically, this is the same question as "how do I know anything is safe". Consumer appliances get recalled periodically, but we trust that they aren't deliberately designed to catch fire. If you aren't sure you trust it, you run it only when/where you can keep an eye on it and/or isolate it so it can't damage more than itself, or you throw it out.
If people don't trust the source of their code, they have two choices: Don't run it, or run it in a highly isolated environment.
The latter is a large part of what the Java Applet and Java security environment is about, but of course that does require that you trust whoever wrote the browser and/or set up the security environment to have done that successfully, and you have to trust that those don't have bugs that can be exploited.
If you're talking about products... There have been various practices published from time to time regarding how code should be written, and tested, for robustness. These days those may include "white hat" security attacks along with full code inspection and so on. If you can show that you're following these practices, it may reassure some folks who otherwise wouldn't trust you... but doing them with full rigor can be expensive, so part of this is knowing what your customers expect and/or will tolerate.
In the end, the real answer is that you need to start by writing trustworthy code, then know what the customer's concerns are and make sure you can meet their requirements either by delivering perfect code and/or by delivering above-average service and/or by being... no, I'm not going to take a cheap shot at that company this time.

Related

Most efficient/secure way to implement trial period in cocoa [closed]

I am finishing a Cocoa App which will use CocoaFob for licensing and I am wondering about the "most" efficient and secure way to implement a trial period in cocoa.
Thanks in advance for your help,
Regards,
For security you need to make sure the check is not in a single easy-to-spot method, as such a method can be switched out at runtime. Ideally the license should be checked in multiple places, where disabling or modifying the check would disable important chunks of the application (e.g. loading initial data).
Having said that, how much do you want to risk inconveniencing a genuine user? And how much time can you justify spending on something that doesn't give anyone a reason to buy your application?
You also have to make sure that keys don't get redistributed, and realistically, if someone is determined enough they will pirate your application one way or another. Spend just enough time to keep honest people honest.
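One rough sketch of the "check it in multiple places" idea, assuming a hypothetical scheme in which the data the app needs at startup is tied to a digest of the license (this is illustrative Python, not CocoaFob's actual API):

```python
import hashlib
import json

def license_digest(license_key):
    """Hypothetical: reduce the license key to a short digest."""
    return hashlib.sha256(license_key.encode()).hexdigest()[:16]

def load_initial_data(path, license_key):
    """Loading the app's initial data depends on the license check, so
    simply patching out a boolean 'is licensed' method isn't enough."""
    with open(path) as f:
        payload = json.load(f)
    # Assumed file layout: the data file records which digest it was
    # prepared for; if the check is stripped out, startup fails.
    if payload.get("licensed_to") != license_digest(license_key):
        raise RuntimeError("Data file does not match this license")
    return payload["records"]
```

The point is that the check does real work the application needs, rather than sitting in one easily patched method.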
Also bear in mind that a trial version won't be accepted into the Mac App Store, and neither will a version with license key management, so you will either be cutting yourself out of that market or distributing a version without license keys, which may get cracked anyway.
Hopefully this helps, and I would be interested in reading what solution you decide to go with.
If you're going to implement a time-based demo, consider basing it on processor time rather than an absolute date. The idea is that the user can use your application fully for, say, 4 hours of CPU time. That way they are not locked into a 'must decide by' date. I have often downloaded something to look at, then later tried to really use it, only to find the trial period had already expired.
It's not that hard to implement and I'm sure users would appreciate it more.
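A minimal sketch of the CPU-time approach, assuming the accumulated usage is persisted between launches (the file name and plain-text storage are placeholders; a Cocoa app would more likely keep, and lightly obfuscate, this value in its preferences):

```python
import time

TRIAL_LIMIT_SECONDS = 4 * 60 * 60  # e.g. 4 hours of CPU time
USAGE_FILE = "trial_usage.txt"     # hypothetical storage location

def load_used_seconds():
    try:
        with open(USAGE_FILE) as f:
            return float(f.read())
    except (OSError, ValueError):
        return 0.0

def save_used_seconds(seconds):
    with open(USAGE_FILE, "w") as f:
        f.write(str(seconds))

def run_session(app_main):
    """Run the app and add this session's CPU time to the running total."""
    used = load_used_seconds()
    if used >= TRIAL_LIMIT_SECONDS:
        raise SystemExit("Trial period is over - please purchase a license.")
    start = time.process_time()  # CPU time of this process, not wall-clock
    try:
        app_main()
    finally:
        save_used_seconds(used + (time.process_time() - start))
```

A plain text file is trivial for a user to reset, so in practice you would sign or obfuscate the stored value, but the CPU-time-versus-calendar-date idea is the same.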

Is This Normal Development Procedure? [closed]

First a little about myself. I am not an experienced software engineer, architect or developer. I have done mostly small ASP and ASP.NET projects in C# for the last 5 years. I am pretty good with HTML and JavaScript. These projects were done when I had free time from my other duties which were not related to software development. I have now been moved into a software developer position. The company I work for is not a software development firm.
I am now working on a Silverlight LOB application with WCF and Entity Framework. I have been given few specifications for this project, just 'make an application like X, only simpler so we don't have to pay for it'; my boss doesn't check on my progress as often as I think he should, and the project manager (a co-worker) will stop by now and then, but we never discuss the specs, architecture, UI or business rules. I am mostly just asked when I think it will be done. I have had to learn Silverlight, WCF and Entity Framework to work on this project, which is not a problem as I really enjoy working with these technologies. The problem is that I am the only one in the company who knows anything about them and I have no mentor/boss to discuss the problems and how they could be solved. I have been able to seek out one interested party in the company who has at least given me a list of some of the requirements.
I can't believe this is how software development should be done. I think project managers should offer guidance and keep a closer eye on what is being done to prevent going in the wrong direction (but how can they in my situation, since they don't know the technologies!).
Should I feel this way or am I way off base?
Thanks for listening.
What you describe is certainly not optimal, but it's extremely common, particularly in smaller shops. Some people find it rewarding to work in that kind of environment. It's not what the software engineering books teach, but that's why there are so many software engineering books.
If you want to continue working in this environment, you're going to have to supply all the discipline you rightly recognize as missing yourself. Write up a spec. Build a schedule. Share these with your management. Hold yourself to deadlines.
Share your concerns with your management; don't be shy about that. Chances are, they recognize the situation. Your boss doesn't check your progress? Publish your progress to him. Show him where you need to get to, how far along you are, and what's blocking you.
It'll be chaotic, no doubt, but you'll learn a lot.
Every organization is different. If they are operating in this way then you should adapt and make the best of the situation. It's either happening because that's how things are done and they are aware of it, or because they don't know any better or don't want to invest in improving the process of delivering strategic/tactical projects.
In a perfect world everyone would have a robust Quality Methodology in place which would provide a framework for Project delivery and systems implementation. It's just not a reality.
Here are some tips to help you operate more effectively:
Identify your sponsors (the people who own the product) and determine the high level benefits and driving objectives of the business problem they seek to solve
Identify your stakeholders (who has influence and who has interest) and get them to communicate their needs as much as possible
Involve both sponsors and stakeholders in the process as much as possible or as much as they want
Capture what requirements you can from them through written form (email)
Provide opportunities for them to gain visibility into the delivery and to provide feedback
Your project will likely fail from your boss's point of view, because I'm sure you are developing a program that won't suit him. But don't feel guilty; it's your boss's problem (because you are a good programmer). Sorry for such a dark post :-).
The role of the project manager is not to know the technology, but they definitely should have a finger on the pulse of the project, so to speak. The real project management job is not to control the project, but rather to enable it. Either way, from your description, looks like yours isn't doing such a great job at it.
The other extreme is a process-heavy organization where meetings and committees decide everything, and all the real communication, if it exists at all, happens through side channels.
The ideal world lies somewhere in between.
Your project manager should not be too concerned with how you're doing things. Since they have no qualifications, the best they can do is connect you with someone who does. When they can't verify that you're building the thing right, they should at the very least ensure you're building the right thing. Even if it's for internal use, you still have a customer, and no communication with the customer spells bad news to me. :)
If your PM is not concerned about the issue, you could try to do something yourself. For example, ask the PM to connect you with a would-be end user of the application. Extract bits of your application and give them to the user to play with -- just make sure the bits you give them don't look or feel too finished.
If you can't change things, take this as a learning experience. Make sure next time you're up for a project, you know the things that went wrong last time, and try to mitigate them from the start.
And finally, if your bosses tell you this is a "more agile way" of working, punch them in the face. Agile is, or should be, synonymous with discipline, not complete lack thereof.
Good luck!
It is a hard situation. Only you can really determine the best way to proceed. However, I do think that the concern with the schedule and concurrent lack of documentation (requirements, expectations, use-case scenario documentation, etc) is a train-wreck waiting to happen. Even the sharpest and most experienced dev-teams suffer from the same problems.
The "when will it be done?" questions are best mitigated by regularly providing small partially functional builds that you can use to get useful information out of the moving target that is your customer. It is amazing how much communication can occur when somebody (your boss/customer/end-user) can actually "play with" something in front of them and reconsider what they really want.
I believe this situation is quite common. I had it, too, at my previous job. The bet here is that you are already independent and well-versed in your area. I think you should tell your manager how you feel about this.
They should change something after hearing your opinion about the situation, because if you do something wrong and the manager does not notice it, the company can lose a lot of money and time.
But it's also not worth constantly waiting for someone to guide you and check your work. In any case, your workflow should include some self-management.

How to hand over a project systematically? [closed]

We had a project handed over from the onshore team to our team (offshore) not long ago. However, we had difficulties with the hand-over process.
We couldn't think of any questions to ask during their design walk-through, because we were overwhelmed by the sheer amount of information. We wanted to ask, but we didn't know what to ask. Since they got no questions from us, management thinks that the hand-over was done successfully.
We had tried to go through all the documentation on our company wiki before attending the handover presentation, but there were too many documents and we didn't even know where to start.
I wonder, are there any rules or best practices that we can follow to ensure a successful project hand-over, either from us or to us?
Thanks.
In terms of reading the documentation, personally I'd go for this order:
Get a short overview of the basic function of the application - what it is meant to achieve. The business case is probably the best document that will already exist.
Then the functional specification. At this point you're not trying to understand any sort of how or technology, just what the app is meant to do. If it's massive, ask them what the key business processes are and focus on those.
Then the high level technical overview. This should include an architecture diagram, required platforms, versions, config and so on. List any questions you have.
Then skim any other useful looking technical documents - certainly a FAQ if there is one, test scripts can be good too as they outline detailed "how to" type scenarios. Maybe it's just me but I find reading technical documents before I've seen the system a waste - it's too academic and they're normally shockingly written. It's certainly an area I'd limit the time I spent on if I didn't feel I was getting a reasonable return for the time I was spending.
If there are several of you, arrange structured reviews between you and discuss the documents you've read, making sure you've got what you need out of them. If the system is big, then each take an area and present it to the others - give yourselves a reason to learn as much as possible; knowing you're going to be quizzed is a good motivator. Make a list of questions where you don't understand something. Having structured reviews between you will focus your minds and make it more of an interactive task, rather than just trawling through page after page of tedious documentation.
Once you get face to face with them:
Start with a full system demo. Ask questions as they come up, don't let them fob you off with unclear answers - if they can't answer something have it written down and task them with getting the answer.
Now get the code checked out and running on your machines. Do this on at least two machines - one they lead, one you lead. Document the whole process - this is the most important step. If you can't get the code running you're screwed.
Go through the build process. Ensure that you can build the app (including any automated build and unit tests they may have). Note that all unit tests should pass - if they don't or if they say "oh, that one always fails" then they need to fix that before final acceptance.
Go through the install process. Do this at least twice, once they lead, once you lead. Make sure that it's documented.
Now come up with a set of common business functions carried out with the application. Use this to walk the code with them. The code base will be too big to cover the whole thing but make sure you cover a representative sample.
If there is a database or an API do a similar exercise. Come up with some standard data you might need to extract or some basic tasks you might need to carry out using the API and spend some time working through these with them.
Ask them if there's anything they think you should know.
Make sure that any questions you've written down anywhere else are answered.
You may consider it worth going through the bug list (open and closed) - start with the high priority ones and talk through anything that looks particularly worrying. Even if they've fixed it, it may point at a bit of code which is troublesome.
And finally if the opportunity exists - if there are any outstanding bugs or changes, see if you can pair program a couple.
Do not finally accept the app unless you are 100% sure you can:
Get the code to compile
Get the code to build (including the database)
Get the application installed
Do not accept that the handover is complete until they have:
Documented anything you picked up on that wasn't covered to your satisfaction
Answered ALL of your questions - a question they won't answer after being asked repeatedly screams of something they're hiding
And grab their e-mail addresses and phone numbers. Even if it's only informal they'll probably be willing to help out if the shit really hits the fan...
Good luck.
My basic process for receiving a handover would be:
Get a general overview of the app, document it
Get a list of all future work that the client expects
... all known issues
... any implementation specifics
As much up-to-date documentation as they have
If possible, have them write some tests for critical components of the system (or at least get them thoroughly documented)
If there is too much documentation (possible) just confirm that it is all up to date, and make sure you find out from them where to start, if it is not clear.
Ask as many questions as possible; anything that comes to mind, because you may not have the chance again.
Most handovers, perhaps all of them, will cause a lot of information to be lost. The only effective way to perform a handover that I have seen is to do it gradually. One way to do that is to allow a few key people from Phase One to stay on the project well into Phase Two.
The extreme solution is to get rid of all handovers, and start using an Agile mindset.
As a start, define the exit criteria for the handover. This should be discussed, negotiated and agreed with both parties and make sure higher management knows this. Then write up a checklist of all things needed to achieve the exit criteria and chase it.
Check out "Software Requirements" and Software Requirement Patterns for ideas on questions to ask when gathering information about a project. I think that just as they would work for new development, they would also help you to come to terms with an existing project.

Should human factor be taken into account when deciding on what process to use? [closed]

When you are deciding on what methodology or process to use for your project, should you take into account the human factors? If there is any resistance to things, do you go with the flow or force people to change?
For example, say you want to push for pair programming but the team members resist working in that mode (or show dislike for it); what would you do? Make them get used to it, try to convince them to do it, or go with the flow and let them do what they like?
The human factor is the most important one.
If you consider nothing else, consider the culture and proclivities of the group.
People who want a process to fail will succeed. It's far easier to alter process than to alter people.
First try to reason, you may be wrong:
If you have resistance to certain things, you can usually give your points for why you think it's good, and hear their points for why they think it is bad, and come to some common grounds.
You should never force people into doing something against their will, but instead try to convince them based on your logical reasoning. Many times you will hear reasons from them that change your point of view.
If your developer is too afraid to voice their opinion, then you should make them feel comfortable with giving their opinion. If they are still reluctant, then you should consider new developers.
Foot in the door principle:
If you want to try some new concept that neither you nor they have experience in, say pair programming, then you can ask them to try it for 1-2 weeks and then you can sit together again after this trial period and assess the effectiveness. I think most people will find it perfectly reasonable to try something new if they have no experience in it, if it is for the purpose of finding out the method's effectiveness, and if it is only for a trial period.
If after this trial period, the thing you were testing was successful, then your developer will be more open to the idea.
Don't change them, find someone who fits:
If you are 100% for some way of doing things, and your developer is 100% against it, and he won't try it and has no logical reason why, instead of trying to change him you're better off finding a developer that will fit into your way of doing things.
If they are 100% against what you want to change, you have to make a decision. Is the developer themselves more important to you, or is the process that you want to change more important.
If you force someone into something they don't want to do, they will find a way to make your method fail.
Yes. Your development process needs to be humane. That said, there are better and worse development practices and you should strive to use the better practices. The best methodologies understand both human strengths and weaknesses and have practices that promote the former and compensate for the latter.
For example, most agile processes put a high value on trusting developers to do the right thing -- to work hard and value quality. They allow developers to have significant input into the process and into the product. This takes advantage of the human quality of rising to expectations. On the other hand, humans have trouble managing too much complexity at one time, so agile practices insist on breaking things down into manageable chunks.
On the other hand, we know that people don't like to do things that don't directly add value to their work. Agile practices, recognizing the value of things like unit testing, nevertheless insist on them and require the developer to conform despite the initial reluctance. Using TDD compensates for this somewhat by giving real value to developing tests -- you write them first and let them guide the design. It's a bit of a carrot-and-stick approach to get developers over the initial reluctance to the point where they can experience the value of the method and buy into it on their own.
Adapting the Process
The key to developing a good process with your people lies in adapting the process to the amount of ceremony that you need or want. We use the RUP where I work and one of the central goals of the RUP is to tailor the amount of ceremony in your process to fit your project and the personnel.
For instance, small projects require far less ceremony and tool support. As well, people new to a process need time to adapt. It's best not to flood them with information and let them adapt at their own pace.
Show Me the Money!
The way to get people to buy into a new process is to let them make a mistake (or present an example from the past) and then show them how the process could have helped prevent it. Try to draw a direct line showing how the process will help them improve the way they work.
For instance: if people are resistant to automating builds and running tests automatically, then the next time they release a fix that breaks a piece of code that was already working, use that opportunity to illustrate that an automated test would have caught the error before it got released, saving everyone time and money.
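The regression test in question can be tiny; something like the sketch below (the function and numbers are made up, just to show the shape of the safety net an automated build would run on every change):

```python
import unittest

def apply_discount(price, percent):
    """Stand-in for the piece of code that was already working."""
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTest(unittest.TestCase):
    def test_existing_behaviour_still_works(self):
        # Pins down the old, correct behaviour; a 'fix' that breaks it
        # fails the automated build instead of reaching a release.
        self.assertEqual(apply_discount(100.0, 10), 90.0)

if __name__ == "__main__":
    unittest.main()
```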
Automation
The way to ensure people can adapt to a process is to remove as much human intervention as you can. Automate builds, tests and reporting as much as possible, using information that is captured automatically.
This helps support the process by removing the "nag" factor. Many people resist a new process because they figure it means more work for them, or extra work that produces little result in the end. By automating existing tasks and gathering data from them, you get a lot of benefit without increasing any individual developer's workload.
A classic example is continuous integration. Continuous Integration tools like CruiseControl, TeamCity or Hudson can work with version control repositories to extract latest versions of source code, build that code, execute and archive test results and package stuff for deployment. This requires no extra effort on the part of the developer but you get a lot of extra "process" in return. You now know how good your source code is, you can distribute it easily and you can catch bugs earlier.
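To make the moving parts concrete, here is the update-build-test loop such a server automates, reduced to a sketch of a script (this is not how CruiseControl, TeamCity or Hudson are actually configured; the repository and build commands are placeholders):

```python
import subprocess

def run(cmd):
    """Run one pipeline step and stop the cycle if it fails."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def ci_cycle():
    run(["git", "pull"])      # get the latest source from version control
    run(["make", "build"])    # placeholder build step
    run(["make", "test"])     # placeholder test step
    run(["make", "package"])  # placeholder packaging step
    # a real CI server would also archive test results and notify the
    # team whenever one of these steps fails

if __name__ == "__main__":
    ci_cycle()
```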

Scrum, but with no testing or documentation [closed]

What do you do when you join a team that says they use Scrum, but only use it as a time-management tool and not the whole process?
How can I reinstate testing and documentation?
I was thinking of starting off by adding user stories specifically for testing and documenting.
Perhaps someone else has more experience with this than I do, as I am sure it's not that uncommon.
The key to Scrum is that a task must be identifiably "done" before it can be classed as done. How does your company assess whether something is done without reviewing documentation and tests?
Perhaps they have an unusual, but valid, way of doing it. Or perhaps they have missed the point of "done" tasks. I'd suggest you start by asking them how they measure "done" and whether that could be improved. Then suggest documentation and testing as a way of improving the process.
Note that neither testing nor documentation are in fact part of Scrum. Scrum is a pure project management approach - the required engineering practices, like the ones you mention, are supposed to "emerge" during the project. And most specifically, they are supposed to be identified during the heartbeat retrospectives that you do at the end of every sprint. Are you doing those? Can you bring up your concerns there - and are they actually the biggest concerns the team has?
Is the issue that they don't have any documentation and tests, or that they aren't implementing the entire Scrum methodology? Those are 2 very different problems in my mind.
I would much prefer an organization that has taken the time and effort to find and fit a development process that matches their development style as opposed to mandating down from on high the one true process. So I would not be concerned at all if they were using a process that they called Scrum but that didn't meet all the "official" guidelines. Try to determine why the process is the way it is. Chances are that if they have taken the time to tailor it, the team will be receptive to your ideas, especially if you have taken the time to determine why things are the way they are. If you simply approach it as "this isn't Scrum and so isn't right", you will probably not make much headway, but by being pragmatic about the benefits you can likely make some substantial improvements.
Alternatively, if they aren't doing testing and don't have any documentation I would consider that a fairly bad sign. And by documentation I am taking the minimalist view here - a list of features, bug tracking, etc. - I would be very concerned by the absence of these items, less concerned by the absence of items higher up the abstraction list. In the absence of support from management, I would suggest you lead by example. Take it on yourself to set up a simple bug tracking system (there are several - in a pinch, simple text lists in a central location work as well). Don't declare your features complete until someone else has tested them. This can be as simple as walking over to another developer and asking them to try a feature in front of you. If someone claims a feature is complete, take a few minutes to familiarize yourself with it. If you find a bug, politely mention it to the responsible developer. Slowly build an environment where the team can see the benefits of running tests and tracking features and bugs.
Most teams operate in this manner simply because of a mistaken belief that they don't have time to "do it right", or that they will get to it later. Often this will occur when a simple proof-of-concept done by a developer or two as a side-project turns into a full-on development effort. By showing that it can actually save time and effort, and reducing the initial costs to the rest of the team, you will often find that it becomes ingrained as part of the process without ever actually being officially endorsed or accepted.
If you have management support it will make it much easier, but always be careful to make sure that the team is receptive to the changes. This may mean it takes longer than you want, but so be it, without the team's support any mandated process will fail at the first sign of pressure, which is when you need the process the most.
*Disclaimer - On my last project I spearheaded the movement to tailor the SCRUM process to fit our environment. The "official" process was simply untenable for our client, but it was still an invaluable guide in tailoring our process.
"adding user stories specifically for testing and documenting"
While meta-user stories might make sense in some circles, it rarely works out well. Software folks rarely cope well with meta-user stories, they either don't get the idea that they can change their own processes by writing a story, or -- more typically -- they engineer the meta-user story to death.
When you're interviewing users, it feels like they're making the user story up. Certainly, you're making it up as you listen to them and try to capture it.
When an IT organization tries to make up its own user stories about how IT should work, the process falls apart. Until the organization has done the thing (testing, for example) a bunch of times manually, they're not really qualified to write user stories. Then, after they've done it, they don't need software development processes, they'll just automate the important bits a little at a time.
I think change has to come from a less formal direction. Actually balking at calling something "done" that hasn't been tested is a good starting point.
IT doesn't do things unless forced. So, meet the users and find out why they're not requiring testing. Coach them to require testing. Tell them the consequences and the words to use.
A lot can go wrong in an organization to lead to poor processes. It's important to know what's wrong, and create a demand for change. The best possible thing is to have your boss complaining that you're not fixing it, rather than you suggesting that perhaps it would be good to fix it.
[It doesn't feel right when your boss demands you fix the process, but it's about the only way change will happen.]