What is Broadcom's StrataXGS architecture? [closed]

What is Broadcom's StrataXGS architecture? Are there any references? I searched the internet and the Broadcom site, but there is no detailed explanation. What is new in this architecture?

I found a sales PDF that might answer some questions:
http://www.broadcom.com/docs/features/StrataXGS_Trident_II_presentation.pdf
Here is a link to some white papers:
http://www.broadcom.com/products/features/cloud_scale_net.php
Some key features:
Enables unprecedented 10/40GbE single chip switch configurations
100+ 10GbE ports with flexibility to support up to 32 40GbE ports
First integrated switch to deliver NVGRE and VXLAN L2-over-L3 transit and gateway switching (see the VXLAN header sketch after this list)
Supports industry's highest equal cost multipathing-based fat-tree networking scale on a single chip
Greater FCoE network scale enabling true LAN/SAN convergence — up to 4X increase in forwarding entries
High port density with direct attach to SFP+/QSFP modules and KR Backplanes
Integrated IEEE 1588 1-step timing solution
Delivers 40 percent reduction in bill of material costs and 30 percent better phase accuracy
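Since the datasheet bullets are terse, here is a minimal sketch of the VXLAN encapsulation header (RFC 7348) that the chip's L2-over-L3 gateway feature builds and terminates in hardware. This is generic illustrative Python, not anything Broadcom-specific:

```python
import struct

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header defined in RFC 7348.

    Layout: 1 byte of flags (the I bit, 0x08, marks a valid VNI),
    3 reserved bytes, a 24-bit VNI, and 1 final reserved byte.
    """
    flags = 0x08
    # The VNI occupies the top 24 bits of the last 4-byte word.
    return struct.pack("!B3xI", flags, vni << 8)

# Example: VNI 5000; the header rides inside a UDP datagram to port 4789.
print(vxlan_header(5000).hex())  # -> '0800000000138800'
```

The value of doing this in silicon is that software never has to build or strip these headers per packet; the snippet only shows the wire format the chip handles.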

Related

Does the llvm-bolt instrumentation mode result in less accurate BOLT profiles? [closed]

The BOLT binary optimizer recommends using perf to profile binaries for optimization. However, if perf is not available, there is an llvm-bolt mode that can also profile the application:
If perf record is not available to you, you may collect profile by first instrumenting the binary with BOLT and then running it.
Evidently, this is presented as a "second choice" by the BOLT authors.
What is the downside of this mode in terms of profile quality? Evidently it is slower to collect the profile, but is it also less accurate or effective as input to the subsequent BOLT optimization pass that produces the final optimized binary?
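For reference, here is a sketch of the two-step instrumentation flow the quote describes, driven from Python. The binary name and profile path are placeholders, and only flags from the BOLT documentation are used:

```python
import subprocess

BINARY = "./myapp"  # placeholder: the binary to optimize

# 1. Instrument: BOLT inserts counters; the profile is dumped when
#    the instrumented binary exits.
subprocess.run(["llvm-bolt", BINARY, "-instrument",
                "--instrumentation-file=/tmp/prof.fdata",
                "-o", BINARY + ".inst"], check=True)

# 2. Run a representative workload with the instrumented binary.
subprocess.run([BINARY + ".inst"], check=True)

# 3. Optimize the original binary using the collected profile
#    (add your usual BOLT optimization flags here).
subprocess.run(["llvm-bolt", BINARY, "-o", BINARY + ".bolt",
                "--data=/tmp/prof.fdata"], check=True)
```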

Is GPS data signed and/or timestamped? [closed]

Is GPS data signed and timestamped by the satellite?
No, GPS signals are not signed. Signed signals may be introduced in the future; something along those lines is already used on the military side of GPS.
GPS spoofing is a well-known problem, and such problems are usually "solved" in hardware. If the signal strength of some satellites changes quickly, do not trust it. Multiple antennas (at the extremes of a lorry or ship) and comparing signal strength helps. A directional antenna helps verify that the signal expected from a satellite actually arrives from where the satellite should be. Often a gyroscope, compass, or dead-reckoning track is also used to check the plausibility of the data.
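As a toy illustration of the kind of plausibility check described above (the speed threshold is an invented example value, not from any standard):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    R = 6_371_000  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * R * asin(sqrt(a))

def fix_is_plausible(prev_fix, new_fix, max_speed_mps=60.0):
    """Reject a fix if reaching it would require an implausible speed.

    prev_fix and new_fix are (lat, lon, unix_time) tuples; 60 m/s is
    an arbitrary example threshold for a ground vehicle.
    """
    dist = haversine_m(prev_fix[0], prev_fix[1], new_fix[0], new_fix[1])
    dt = new_fix[2] - prev_fix[2]
    return dt > 0 and dist / dt <= max_speed_mps
```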
Note: fishing boats use this to disguise their own position (in protected areas or exclusive economic zones). They cause a lot of trouble for other ships and boats (and sometimes for ground equipment).

ActiveMQ vs Apollo vs RabbitMQ vs Qpid (AMQP) [closed]

I am trying to figure out the best MQ option for my requirements. I need the ability to transfer both text and binary messages within and across geographically diverse data centers with high reliability. Speed is nice, but the ability to scale matters as well. Commercial support is nice to have, as with RabbitMQ.
Here are some assumptions:
Use federation or the shovel to push identical messages across data centers.
Use AMQP to transfer binary messages, since we are a .NET/Python shop.
I want to make sure my assumptions are valid and need help picking an MQ. I have used ActiveMQ+MySQL in the past, but I like the option of Mnesia for messaging with persistence. Also, is it alright to use AMQP 0.9 instead of 1.0? It looks like RabbitMQ supports 1.0 via a plugin.
I'd appreciate any alternative suggestions.
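To make the AMQP assumption concrete, here is a minimal sketch of publishing a persistent binary message over AMQP 0-9-1 with pika, RabbitMQ's Python client. The host, queue name, and payload are placeholder values:

```python
import pika

# Connect to a local broker (placeholder host).
conn = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = conn.channel()

# Durable queue: survives broker restarts.
channel.queue_declare(queue="dc-replication", durable=True)

channel.basic_publish(
    exchange="",                    # default exchange routes by queue name
    routing_key="dc-replication",
    body=b"\x00\x01binary-payload", # arbitrary bytes: AMQP bodies are opaque
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)
conn.close()
```

Cross-data-center replication itself would be layered on top via the federation or shovel plugins; this snippet only shows the per-message publish path.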

History of Embedded Software [closed]

As far as I understand it, embedded software is just software (that runs on a general purpose CPU) that has little if any user input or configuration. Embedded software powers IP routers, cars, computer mice, etc.
My question is:
When (roughly) was the historical moment when embedded software was first considered cost-effective for some applications (rather than an equal technical solution not involving embedded software)? Which applications and why?
Detail: Obviously there is a tradeoff between the cost of a CPU fast enough to perform X in software versus the cost of designing hardware that performs X.
Embedded systems date from the Apollo moon landings; specifically the Apollo Guidance Computer (AGC), widely held to be one of the first examples of an embedded system.
Commercially, in the early 1970s, early microprocessors were being employed in products, famously the 4-bit Intel 4004 used in the Busicom 141-PF calculator. Bill Gates and Paul Allen saw the potential of embedded microprocessors early with their pre-Microsoft endeavour, the Traf-O-Data traffic survey counter.
So I would suggest around 1971/72, with the introduction of the Intel 4004 and the more powerful 8-bit 8008. Note that unlike the still more powerful Intel 8080, which inspired the first home-brew microcomputers and the MITS Altair, the 4004 and 8008 were barely suitable for use as a general-purpose "computer" as such; embedded computing systems therefore pre-date general-purpose microcomputers.
I would dispute your characterisation of what an embedded system is; if you were asking that question, here's my answer to a similar one.

ZigBee and embedded systems [closed]

I'm new to ZigBee. I need help learning about it and understanding how to implement an embedded system using it.
Thanks in advance.
One good place to look is on the Digi site. They have several products to help you, including embedded development kits.
If you want to go open source, look at SourceForge. They have some open-source stacks. I have not used any of them, so I cannot comment beyond knowing that they exist.
Depending on your needs, you might want to just look at XBee, which is a subset of ZigBee. There are some nice development tools for XBee. I have used an XBee expansion shield with the .NET Micro Framework and boards provided by TinyCLR to build a wireless prototype.
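As an example of how little code an XBee prototype needs, here is a sketch using pyserial, assuming a module in transparent (AT) mode behind a USB serial adapter; the port name and baud rate are example values:

```python
import serial  # pyserial

# In transparent mode the XBee simply relays whatever bytes it receives
# on its serial port over the 802.15.4 radio, and vice versa.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as xbee:
    xbee.write(b"hello over 802.15.4\n")  # sent to the remote node as-is
    reply = xbee.read(64)                 # whatever the remote node sent back
    print(reply)
```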
Creating a ZigBee stack on your own would be a fairly large task, so only you can determine whether there is ROI in doing so. I would be more inclined to buy it in.
Get a ZigBee Starter Kit. Lots of vendors provide one; just Google that exact phrase.
For example: AVR 8-Bit RISC - IEEE 802.15.4/ZigBee - Tools
Or you can run ZigBee on a PIC/Microchip part at very low cost. http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=2112