Is it possible to offload the modular exponentiation required by RSA and Diffie-Hellman to the hardware crypto co-processor that comes with the Octeon II platform?
In some documents I have seen that the Octeon II platform does support this, but I cannot figure out how to do it. Which macros should be used, etc.?
Can anyone please provide me with some pointers?
These look like proprietary functions to me; you would find them in the proprietary SDKs. The OCTEON Development Kits page specifies:
Libraries: ‘C’ acceleration libraries for compression/decompression, pattern matching, encryption/decryption, Robust Header Compression
Generally, "encryption/decryption" in such documentation should be read as a substitute for the word cryptography, as most people don't realise that encryption/decryption is just one part of the crypto spectrum.
For more info, please contact Cavium.
For a VB.NET-based VSTO add-in, we need to plan for a future port to TwinBasic, which is based on an unmanaged COM architecture.
Especially for encrypted data, we need to make sure we use an encryption algorithm that behaves identically whether the code is managed VB.NET or unmanaged TwinBasic.
I read the following:
“In the .NET Framework, the classes in the System.Security.Cryptography namespace manage many details of cryptography for you. Some are wrappers for the unmanaged Microsoft Cryptography API (CryptoAPI)”
http://www.vb-net.com/VB2015/Language/Strings.%20Culture,%20Convert,%20Validate,%20Crypt.pdf
Figuring out which System.Security.Cryptography classes are unmanaged Microsoft Cryptography API (CryptoAPI) wrappers is not something I know how to do well.
Which encryption algorithm in System.Security.Cryptography should I use (one that is decently solid, and that hopefully uses hardware acceleration for performance), so that I can later use the identical algorithm from unmanaged TwinBasic code via CryptoAPI?
I will use the algorithm both to encrypt/decrypt individual strings and to encrypt/decrypt entire JSON files.
Your experienced advice and considerations are very helpful, thank you!
FYI: TwinBasic is VB6 extended with a lot of new functionality and language features; see more at twinbasic.com.
As stated by @wqw in a comment on Jun 24, 2021:
Just use some AEAD cipher like AES in GCM mode and you're future proof. AES is supported by each and every crypto library and is implemented in hardware on most CPUs.
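For illustration only (my addition, not part of the quoted comment), here is a minimal sketch of AES-256-GCM encryption in C using OpenSSL's EVP API, one of the many libraries that expose the algorithm. The managed side would use the System.Security.Cryptography types and the unmanaged side would use CryptoAPI/CNG; whether GCM is available there depends on the framework and OS version, so verify that before settling on it.

    #include <openssl/evp.h>

    /* Encrypts pt_len bytes with AES-256-GCM.
     * key: 32 bytes, iv: 12 bytes, tag: 16 bytes (output).
     * Returns the ciphertext length, or -1 on error. */
    int aes256_gcm_encrypt(const unsigned char *key, const unsigned char *iv,
                           const unsigned char *pt, int pt_len,
                           unsigned char *ct, unsigned char *tag)
    {
        EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
        int len = 0, total = 0;

        if (ctx == NULL)
            return -1;

        /* AES-256-GCM with a 96-bit IV (the default IV length for GCM). */
        if (EVP_EncryptInit_ex(ctx, EVP_aes_256_gcm(), NULL, key, iv) != 1)
            goto err;

        if (EVP_EncryptUpdate(ctx, ct, &len, pt, pt_len) != 1)
            goto err;
        total = len;

        if (EVP_EncryptFinal_ex(ctx, ct + total, &len) != 1)
            goto err;
        total += len;

        /* Fetch the 16-byte authentication tag; store or transmit it with
         * the ciphertext so the receiver can verify integrity on decrypt. */
        if (EVP_CIPHER_CTX_ctrl(ctx, EVP_CTRL_GCM_GET_TAG, 16, tag) != 1)
            goto err;

        EVP_CIPHER_CTX_free(ctx);
        return total;

    err:
        EVP_CIPHER_CTX_free(ctx);
        return -1;
    }

Whatever library you pick on each side, as long as the key, nonce/IV, tag length and any associated data match, the two implementations will interoperate.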
In the Yosys manual I read
C.108 read – load HDL designs
-sv2005 -sv2009 -sv2012
Load the specified Verilog/SystemVerilog files. (Full SystemVerilog support is only available via Verific.)
C.113 read_verilog – read modules from Verilog file
-sv enable support for SystemVerilog features. (only a small subset of SystemVerilog is supported)
Is there a concise spec anywhere about this? If not, a guideline? Which Verilog and which SystemVerilog?
What is Verific?
In A Free and Open Source Verilog-to-Bitstream Flow for iCE40 FPGAs, Clifford Wolf says that "Verilog is pretty much everything from Verilog 2005". What is not from Verilog 2005? Have there been changes since this late-2015 lecture?
Verific is a reference to a commercial front-end provided by Verific Design Automation. There's a commercial version of Yosys sold as part of the Symbiotic EDA Suite that comes with this Verific front end installed. The Verific front end provides full VHDL+SV support as you see above.
The open source version of Yosys officially supports only Verilog 2005. Unofficially, several SystemVerilog features have been added to it (enum, typedef, etc.), and there is beta VHDL support provided by GHDL-synth, which is a work in progress.
It looks like the folks at Antmicro were kind enough to provide a frontend that allows direct reading of SystemVerilog. Check the following blog post: https://antmicro.com/blog/2022/02/simplifying-open-source-sv-synthesis-with-the-yosys-uhdm-plugin/
I was recently asked about how to use a C library (Cello in this case) in an embedded environment, but I'm not sure how to go about that.
Is it correct to say that if a library can be compiled in the embedded environment, it can be used?
Should I care about making the library more lightweight or something like that?
Any suggestions are appreciated.
To have it compile is the bare minimum. Notably, most embedded systems are freestanding systems, such as microcontroller and RTOS applications. Compilers for freestanding systems need not provide all standard library headers; the only mandatory ones are (C17 4/6):
<float.h>, <iso646.h>, <limits.h>, <stdalign.h>, <stdarg.h>, <stdbool.h>,
<stddef.h>, <stdint.h>, <stdnoreturn.h>
In addition, the embedded system need not support floating point arithmetic. Some systems implement software floating point support, but using that is very bad practice. If your MCU does not have an FPU, you should not be using floating point arithmetic, or you picked the wrong MCU for the task, period.
"I need to represent this number with decimals internally or to the user" is not a valid reason for using floating point. Fixed point arithmetic should be used for that (a short sketch follows the checklist below). You only need floating point if you are going to use math libraries like math.h and more advanced math.
Traditionally, embedded system compilers have been slow to adopt the latest C standard. It's been quite a while since the C11 release now, though, so at the moment all useful compilers have caught up with it (C17 only contains minor changes, so we can likely ignore that one). Historically, embedded compilers have been horribly bad at this, so remain sceptical. There shouldn't be any reason to pick a compiler without C11 support for new product development.
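Several of these constraints can be detected mechanically at compile time with the standard predefined macros; an illustrative, library-agnostic fragment:

    /* Compile-time probes using macros defined by the C standard itself. */
    #if !defined(__STDC_VERSION__) || (__STDC_VERSION__ < 201112L)
    #error "A C11 (or later) compiler is required"
    #endif

    #if defined(__STDC_HOSTED__) && (__STDC_HOSTED__ == 0)
    /* Freestanding implementation: only the minimal headers listed above are
       guaranteed, so <stdio.h>, <string.h>, malloc() etc. may be missing. */
    #endif

    #if defined(__STDC_NO_ATOMICS__) || defined(__STDC_NO_THREADS__)
    /* Optional C11 features the library must not rely on unconditionally. */
    #endif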
Summary for getting the lib to compile (bare minimum):
Does the library use hosted system headers, and if so does the embedded compiler support them?
Does the library use floating point, and if so, does the target system have an FPU, or at least a software floating point lib?
Does the library rely on the latest C standards and if so does the embedded compiler support them?
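Regarding the floating point item: if there is no FPU, fixed point is the usual substitute. A minimal Q16.16 sketch in C, purely illustrative and not taken from any particular library:

    #include <stdint.h>
    #include <stdio.h>

    /* Q16.16 fixed point: 16 integer bits, 16 fractional bits. */
    typedef int32_t q16_16;

    #define Q16_ONE (1 << 16)

    static q16_16 q16_from_int(int32_t x) { return x * Q16_ONE; }

    static q16_16 q16_mul(q16_16 a, q16_16 b)
    {
        /* Widen to 64 bits so the intermediate product cannot overflow. */
        return (q16_16)(((int64_t)a * b) >> 16);
    }

    int main(void)
    {
        q16_16 price = q16_from_int(3) + Q16_ONE / 4;    /* 3.25 */
        q16_16 total = q16_mul(price, q16_from_int(2));  /* 6.50 */

        /* Print integer and fractional parts (positive values only here). */
        printf("total = %ld.%02ld\n",
               (long)(total >> 16),
               (long)(((total & 0xFFFF) * 100L) >> 16));
        return 0;
    }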
With that out of the way, you have to consider if the library is at all written to be portable. Did they take care with things like integer types, enums and alignment? Are they using stdint.h or are they using "sloppy typing" with int all over the place? Did they consider endianness? Is the lib using dynamic allocation, which is banned in most embedded systems? Is it compatible with industry standards like MISRA-C? And so on.
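To make some of those portability points concrete, here is a small illustrative fragment (not from the Cello code base) showing fixed-width types, a compile-time assertion for layout assumptions, and endianness-independent serialisation:

    #include <assert.h>   /* static_assert (C11) */
    #include <stdint.h>
    #include <stdio.h>

    /* Fixed-width types instead of "sloppy" int/long, so the layout is the
       same on an 8-bit MCU and on x86-64. */
    typedef struct {
        uint32_t id;
        uint16_t flags;
        uint8_t  payload[16];
    } frame_t;

    /* Make hidden assumptions explicit: fail at compile time, not in the field. */
    static_assert(sizeof(frame_t) <= 24, "unexpected padding in frame_t");

    /* Serialise explicitly instead of memcpy'ing the struct, so the target
       CPU's endianness no longer matters on the wire. */
    static void put_u32_le(uint8_t *out, uint32_t v)
    {
        out[0] = (uint8_t)(v);
        out[1] = (uint8_t)(v >> 8);
        out[2] = (uint8_t)(v >> 16);
        out[3] = (uint8_t)(v >> 24);
    }

    int main(void)
    {
        uint8_t buf[4];
        put_u32_le(buf, 0x11223344u);
        printf("%02x %02x %02x %02x\n", buf[0], buf[1], buf[2], buf[3]);
        return 0;
    }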
Then there's optimizations to consider on top of that. Optimizing code for microcontrollers is very different than optimizing code for PC CPUs.
A brief glance at the various "compiler switches" (#ifdef) present usually gives a clue of how portable the code is. Looking (very briefly) at this Cello lib, they seem to have considered porting between mainstream x86 systems, but that's it. You would have to rewrite pretty much the whole lib if you were to port it to an embedded system. The work effort depends on how alien the target CPU is compared to x86. Porting to a high-end, little-endian Cortex-A might not require much effort. Porting to some low-end crap MCU would require a monumental effort.
Code portability is a big topic and requires very competent C programmers. To make the very same code run on, for example, an x86-64 and a crappy 8-bit MCU is not a trivial task.
Professional libs like protocol stacks usually come with a system port for a specific MCU, where they have taken not just generic portability into account, but also the specific system.
Not all libraries that can be compiled can be used in embedded environments. Libraries that use malloc and free (or their C++ counterparts) are dangerous and should therefore be handled with care. These libraries can result in non-deterministic behaviour when memory allocations fail.
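One common alternative, sketched below purely as an illustration, is to serve allocations from a statically allocated pool, so the worst case is fixed at build time and failure is deterministic; whether a given library can be wired to such an allocator depends on it exposing an allocation hook.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    #define POOL_SIZE 4096u

    static uint8_t pool[POOL_SIZE];
    static size_t  pool_used;

    /* Tiny bump allocator: memory comes out of a static buffer and is never
       freed individually, so there is no heap fragmentation and exhaustion
       shows up as a clean NULL return. */
    static void *pool_alloc(size_t n)
    {
        size_t aligned = (n + 7u) & ~(size_t)7u;   /* keep 8-byte alignment */

        if (aligned > POOL_SIZE - pool_used)
            return NULL;

        void *p = &pool[pool_used];
        pool_used += aligned;
        return p;
    }

    int main(void)
    {
        void *a = pool_alloc(100);
        void *b = pool_alloc(5000);                /* too big: fails cleanly */
        printf("a=%p b=%p used=%zu\n", a, b, pool_used);
        return 0;
    }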
It is possible that the whole standard C library could be compiled for an embedded device, but that doesn't mean you'll have much use for printf or scanf. So a better question than whether you can compile it is whether you should use it. Cello seems like a fun experiment but isn't a stable platform to develop something real on. It can be done, though; an example of that is the Espruino.
Most of the time it is a bad idea to rewrite a library to be 'lightweight' or, more importantly in an embedded environment, statically allocated. You are probably not as smart as its authors, or you won't put in the time needed to create a complete, functional embedded fork that is as stable as the original or even better. Don't be dissuaded from a fun little side project, but don't depend on it for a real project.
Another problem could be that the library is too big for your microcontroller. The ATmega32A only has 32 KB of programmable flash. To take a C++ example off the top of my head: Boost won't fit in that space, for all the highly useful tools it provides.
Is there some direct way (without writing platform-specific code) to collect some entropy from the underlying system?
Do you have any plans to add a cross-platform entropy-collecting mechanism? It would be a very useful feature.
Our encryption support is all implemented via a cn1lib, which isn't the latest and greatest either, but it provides most of the basic stuff we need. We never went into that level of encryption/security when porting that library. If the native platforms support more secure randoms, you can use a native interface to map to that and possibly also submit a pull request to the lib.
What are the programming features that are missing in C++ and Java?
For example, you can't do recursive programming in QBasic, and you can't dynamically allocate memory in QBasic.
What would be good-to-have features in C++ and Java?
I think Lisp Programmers will be able to add a few.
I miss lambda expressions.
This answer deals only with C++
Things I miss from the syntax, or the standard library:
RegExp as part of the standard library
Threads as part of the standard library
Pointers to member methods (not objects!)
Properties would be nice (I have seen code that emulates this via the C++ preprocessor... not nice-looking code).
A lower-level networking API (sockets!), and a higher-level API (get this file from this FTP server, submit "this" to this site via POST).
This is the list of things I would like to see, but I assume other people will disagree with me.
A memory garbage collector would be nice.
An interface for a GUI toolkit - let MSVC map it to Win32, and on Linux... (good question!)
A stable ABI. In C it's effectively a standard, but in C++ we are still decades away. I also want a stable ABI between compilers - I want to compile one library with MinGW and another with CL, and everything should work together.
This is the list of things I would like to see fixed, but I know they will not go away:
Compatibility with C. Really, it's a myth right now. using namespace std killed it.
Includes and headers. Most of the information is already available in the DLL/.so/.a "library"; do we really need to keep this bad decision from 30 years ago? If needed, the compilers should keep that information in the binaries.
The need for Makefiles - the compiler should be smart enough to know what to do with the code, from the code itself. Pascal does this quite well; I think D does too.
(I might be wrong, please correct me.) The official standard should be openly and freely available for viewing. Why should I pay for the official papers? Do I have to for HTTP? UTF-8? Unicode?
I think this is a very subjective question. From a theoretical point of view there's nothing "missing" in Java, because in terms of what you can achieve in an application you can do everything you want.
As with QBasic: recursion may not be possible, but that doesn't prevent you from changing your recursive algorithm into an iterative one, as sketched below. Programming language theory tells us that you can do this for every recursive problem. So there's nothing missing here either.
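A trivial illustration in C (the language doesn't matter): the same computation written recursively and then iteratively. For simple cases the rewrite is mechanical; in the general case an explicit stack replaces the call stack.

    #include <stdio.h>

    /* Recursive version. */
    static unsigned long fact_rec(unsigned n)
    {
        return (n <= 1) ? 1UL : n * fact_rec(n - 1);
    }

    /* The same computation rewritten iteratively: no call stack needed, which
       is what you have to do in a language or on a target without recursion. */
    static unsigned long fact_iter(unsigned n)
    {
        unsigned long acc = 1;
        while (n > 1)
            acc *= n--;
        return acc;
    }

    int main(void)
    {
        printf("%lu %lu\n", fact_rec(10), fact_iter(10));
        return 0;
    }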
I think what you mean are features that are "nice to have" - and here everyone has to decide for himself. I'd even say there are features in the language which would have been "nice not to have" such as static imports - but again this is my subjective opinion...