Graduation Project on Cryptography [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I have to do a graduation project:
Theme: Cryptography
Development time: 2 months max
I am looking for creative ideas :), not simple proofs of concept ;)
Thanks

You could create an actual implementation of a fair Mental Poker game. There are plenty of open-source card games out there that you could use for the user-interface part, leaving the crypto and the network protocol as the main work you'd have to do.

The original SRA (Shamir, Rivest, Adleman) protocol is neither fair (it leaks quadratic-residuosity information) nor completely private (a player's strategy must be revealed after showdowns).
At Certimix (http://www.certimix.com) we've developed the first real-time complete mental poker protocol, but we haven't published the paper yet.
You can implement other protocols such as:
[BS03] A. Barnett and N. Smart. Mental poker revisited. In Proc. Cryptography and Coding, volume 2898 of Lecture Notes in Computer Science, pages 370--383. Springer-Verlag, December 2003.
[CDRB03] J. Castellà-Roca, J. Domingo-Ferrer, A. Riera, and J. Borrell. Practical mental poker without a TTP based on homomorphic encryption. In T. Johansson and S. Maitra, editors, Progress in Cryptology, Indocrypt 2003, volume 2904 of Lecture Notes in Computer Science, pages 280--294. Springer-Verlag, 2003.
[Cré86] C. Crépeau. A zero-knowledge poker protocol that achieves confidentiality of the players' strategy, or how to achieve an electronic poker face. In A. M. Odlyzko, editor, Advances in Cryptology - Crypto '86, volume 263 of Lecture Notes in Computer Science, pages 239--250. Springer-Verlag, Berlin, 1986.
BS03 is the best of these, but it is still too slow for more than a couple of players.
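For a sense of what the crypto part of such a project involves, here is a minimal sketch, in Python with toy parameters chosen purely for illustration, of the commutative SRA-style exponentiation cipher that the classic mental poker protocols build on: because encrypting with Alice's key and then Bob's gives the same result as the reverse order, two players can jointly lock, shuffle, and later open cards without a trusted dealer. As noted above, plain SRA leaks quadratic-residuosity information, so treat this only as a starting point, not as a fair protocol.

    # Toy sketch of SRA-style commutative encryption (NOT secure as written).
    # E_k(m) = m^k mod p with gcd(k, p-1) = 1, so two players' encryptions commute.
    import secrets
    from math import gcd

    P = 2**127 - 1  # shared prime modulus; illustrative only

    def keygen():
        """Return (e, d) with e*d = 1 mod (P-1): encryption/decryption exponents."""
        while True:
            e = secrets.randbelow(P - 2) + 2
            if gcd(e, P - 1) == 1:
                return e, pow(e, -1, P - 1)

    def enc(m, e):
        return pow(m, e, P)

    def dec(c, d):
        return pow(c, d, P)

    ea, da = keygen()  # Alice's key pair
    eb, db = keygen()  # Bob's key pair

    card = 7  # a card encoded as an integer with 1 < card < P

    # Commutativity: Alice-then-Bob equals Bob-then-Alice.
    assert enc(enc(card, ea), eb) == enc(enc(card, eb), ea)

    # A doubly locked card can be opened in either order.
    locked = enc(enc(card, ea), eb)
    assert dec(dec(locked, da), db) == card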

Related

Terms "Systematic", "disciplined", "quantifieable" in the IEEE software engineering definition [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
The IEEE defines software engineering in the following way:
(1) The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; that is, the application of engineering to software.
Source: http://www.idi.ntnu.no/grupper/su/publ/ese/ieee-se-glossary-610.12-1990.pdf
But what does systematic, disciplined and quantifiable mean in this context? Is there any further explanation from the IEEE?
You can rely on a dictionary:
http://dictionary.reference.com/browse/systematic:
having, showing, or involving a system, method, or plan
http://dictionary.reference.com/browse/disciplined:
having or exhibiting discipline; rigorous
http://dictionary.reference.com/browse/quantifiable:
to determine, indicate, or express the quantity of.
So, applying software engineering means following a planned, rigorous method, with measurable steps and well-defined procedures, for the development, operation, and maintenance of software.
Or, picking up the last part of the definition ("the application of engineering to software"):
http://dictionary.reference.com/browse/engineering:
the art or science of making practical application of the knowledge of sciences, as computer science, as in the construction of software. (I changed this definition a little bit.) =)

History of Embedded Software [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
As far as I understand it, embedded software is just software (running on a general-purpose CPU) that has little, if any, user input or configuration. Embedded software powers IP routers, cars, computer mice, etc.
My question is:
When (roughly) was the historical moment when embedded software was first considered cost-effective for some applications (rather than an equivalent technical solution not involving embedded software)? Which applications, and why?
Detail: Obviously there is a tradeoff between the cost of a CPU fast enough to perform X in software and the cost of designing hardware that performs X.
Embedded systems date from the Apollo moon landings; specifically, the Apollo Guidance Computer (AGC) is widely held to be one of the first examples of an embedded system.
Commercially, microprocessors were being employed in products in the early 1970s, famously the 4-bit Intel 4004 used in the Busicom 141-PF calculator. Bill Gates and Paul Allen saw the potential of embedded microprocessors early with their pre-Microsoft endeavour, the Traf-O-Data traffic survey counter.
So I would suggest around 1971/72, with the introduction of the Intel 4004 and the more powerful 8-bit 8008. Note that, unlike the still more powerful Intel 8080, which inspired the first home-brew microcomputers and the MITS Altair, the 4004 and 8008 were barely suitable for use as a general-purpose "computer" as such; embedded computing systems therefore pre-date general-purpose microcomputers.
I would dispute your characterisation of what an embedded system is; if that is what you were really asking, here's my answer to a similar question.

How do I check source-code-homework for plagiarism? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
How can I tell that two pieces of source code (independent of their language: C, Java, Lisp, ...) show strong indications of being plagiarised from each other?
Background: I am going to give my first seminar on computer languages. We have prepared small exercises for major programming languages such as C/C++, Python, and Java, but also OCaml, Haskell, and others, to give the students a practical introduction (also to programming paradigms). We expect about 300 students with more than 50 programming tasks per person, so a single person cannot check all the homework.
I guess anti-plagiarism techniques used for natural-language text (essays, papers, book chapters, etc.) will not work for source code, right? Also, solutions to these programming tasks will have inherent similarity because of the required interface.
I've done a little searching and found MOSS mentioned in "Checking for code plagiarism with JavaScript" and "Variable renaming for plagiarism detection for C/C++".
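For intuition about how tools like MOSS approach this, here is a minimal sketch in Python (the function names and parameters are my own, purely illustrative): identifiers and literals are normalised so that renaming variables changes nothing, overlapping k-grams of tokens are hashed, and two submissions are compared by the overlap of their fingerprint sets. Real tools additionally winnow the fingerprints and support many languages.

    # Sketch of token-based fingerprinting, the core idea behind tools like MOSS.
    import io
    import keyword
    import tokenize

    def normalised_tokens(source):
        """Tokenise Python source, replacing identifiers and literals with placeholders."""
        out = []
        for tok in tokenize.generate_tokens(io.StringIO(source).readline):
            if tok.type == tokenize.NAME and not keyword.iskeyword(tok.string):
                out.append("ID")          # variable/function names are interchangeable
            elif tok.type in (tokenize.NUMBER, tokenize.STRING):
                out.append("LIT")
            elif tok.type in (tokenize.OP, tokenize.NAME):
                out.append(tok.string)    # keywords and operators are kept verbatim
        return out

    def fingerprints(source, k=5):
        """Hash every k-gram of normalised tokens (a real tool would winnow these)."""
        toks = normalised_tokens(source)
        return {hash(tuple(toks[i:i + k])) for i in range(len(toks) - k + 1)}

    def similarity(a, b):
        """Jaccard similarity of two submissions' fingerprint sets."""
        fa, fb = fingerprints(a), fingerprints(b)
        return len(fa & fb) / len(fa | fb) if (fa or fb) else 0.0

    original = "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"
    renamed  = "def add_up(vs):\n    acc = 0\n    for v in vs:\n        acc += v\n    return acc\n"
    print(similarity(original, renamed))  # close to 1.0 despite renamed identifiers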
Award a small prize for detecting it. Given the possibility of a couple of beers, students will pore over the net for hours, looking for matches in other students' submissions.
With large fines for offences, it's self-financing and rewards students who do their own work: they want beer and are not going to leave themselves open to revenge by plagiarising work themselves!

Is Domain-Driven Design still valid right now? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
We want to adopt a model or guideline for our OO designs. We like Domain-Driven Design. Our specific question is: is DDD still valid today, beyond design patterns? If so, are there other approaches or variants we should evaluate?
We mainly develop enterprise web and desktop applications using Visual Studio (C#).
Thanks in advance
In my opinion, DDD is as pertinent today as ever. The idea that one should strive for a Ubiquitous Language, such that the domain in code is not divorced from the domain as described by the domain experts, will probably remain a good idea for a long time, and it is easier today to treat the domain as the primary concern and persistence as a "secondary" problem than it used to be. It is also still true that DDD requires a significant design effort, and its value is going to be proportional to how complex the domain is.
I have not written any application using the methodology, but I have been reading a lot about Event Sourcing and CQRS lately; both seem like very interesting approaches that should fit well with DDD (and they are usually advocated by people who are DDD proponents).
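For readers who have not come across the term, here is a minimal, purely illustrative sketch of the event-sourcing idea (in Python for brevity; the question's stack is C#, but the shape is the same, and all names are invented): instead of storing an aggregate's current state, you store the events that happened to it and rebuild the state by replaying them.

    # Minimal illustration of event sourcing: state is derived by replaying events.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Deposited:
        amount: int

    @dataclass(frozen=True)
    class Withdrawn:
        amount: int

    class Account:
        """Aggregate whose state is rebuilt from its event history."""
        def __init__(self, history=()):
            self.balance = 0
            self.events = []
            for event in history:
                self._record(event)

        def deposit(self, amount):
            self._record(Deposited(amount))

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")  # invariant lives in the aggregate
            self._record(Withdrawn(amount))

        def _record(self, event):
            # Apply the event to in-memory state and remember it; in a real system
            # the event list would be persisted in an event store.
            if isinstance(event, Deposited):
                self.balance += event.amount
            elif isinstance(event, Withdrawn):
                self.balance -= event.amount
            self.events.append(event)

    acct = Account()
    acct.deposit(100)
    acct.withdraw(30)
    rebuilt = Account(history=acct.events)   # replaying the events yields the same state
    assert rebuilt.balance == acct.balance == 70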
I can't find it right now, but there is a video interview with Eric Evans floating around somewhere on the web that you may be interested in watching; it is a kind of retrospective on the methodology a few years after he wrote the book, including what he would do differently now.
I think DDD is just as alive (or just as dead) as before. My opinion is that the "domain" is a hot topic today because of DSLs (Domain-Specific Languages) and MDE (Model-Driven Engineering).
You may want to learn more about a similar "domain-driven" approach called DSM (Domain-Specific Modeling). In DSM, you can work with patterns, but you also define code generators that translate your domain-specific design into working code.
Check the DSM Forum or Wikipedia for more information about DSM.
The two most notable tools right now in this area are MetaEdit+ from MetaCase and AtomWeaver from Isomeris.

Looking for some good CS videos [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
I'm wondering if there are any good free videos of actual college-level Computer Science courses, covering both hardware and programming. Does anyone know where I might find a few to review some concepts?
I couldn't find anything useful in iTunes U, just a bunch of promotional material for each university; nothing of real value and no actual class recordings.
Here's what I found by Googling: Free Computer Science Video Lecture Courses, but some of the links are broken, so you might need to do some further Googling to get the right ones.
Here are some starters:
Berkeley 2006 Spring: CS 61A The Structure and Interpretation of Computer Programs
MIT: 6.001 Structure and Interpretation of Computer Programs
Berkeley 2006 Spring: CS 61B Data Structures
Berkeley 2006 Spring: CS 61C Machine Structures
Stanford: Programming Methodology
Stanford: Programming Paradigms
I picked spring semesters because, if I recall correctly, they are longer. Here are some courses that are more advanced (and less dry):
Stanford: Natural Language Processing
Stanford: Machine Learning
Stanford: Introduction to Robotics
Perhaps you want to check out the Stanford University YouTube channel. There are several computer science courses on there.
Some of them have taught me a lot.
Google, Yahoo, and O'Reilly all have great Tech Talks on YouTube and their dev blogs.
I highly recommend Crockford on JavaScript. He gives a wonderful overview of the history of programming languages in the first lecture.
If you're interested in current technology, I'd also highly recommend the Google lecture series on MapReduce, the Tech Talk on CouchDB, as well as the Yahoo talk on Node.js.