Closed. This question is opinion-based. It is not currently accepting answers.
Closed 1 year ago.
I would like to ask for some expert advice and directions on building a database and architecture for a telecommunications app.
Basically, there are three parts:
Switch (low level, calls, signaling)
Backend (CRM)
Mobile
Right now I am mostly focused on the backend part, so I would like to know if anyone has experience with this and could point me in some directions for research. Nothing special, just things like SQL vs. NoSQL and some good starting points where I could learn and research more. Thanks!
I have worked for both Intergraph and Ericsson on their telecommunications apps and can suggest a few places for you to start your research.
Domain
The telecom domain is very large, so I would suggest you determine what your audience wants you to focus on first. Are you interested in facilities management, which is where poles, lines, and equipment are located? Are you interested in logical network management, where you are worried less about the physical cables themselves and worried more about the logical circuits that "ride" these physical media? Narrowing your domain will help you tremendously.
There are quite a few resources online, but I would suggest a reference, like this one: Fundamentals of Telecom Book
Do your best to research and know your domain.
Data Modeling
For telecom data modeling, check out ESRI at www.esri.com. They publish a telecom data model white paper that is worth a look (you'll have to register to see it): http://downloads2.esri.com/support/TechArticles/Telecommunications_Data_Model.pdf
Also, check out the offerings from IBM at: https://www-01.ibm.com/marketing/iwm/iwm/web/signup.do?source=sw-infomgt&S_PKG=500019725&S_CMP=is_bro11&S_TACT=109HF36W
Finally, check out the products from Intergraph, Smallworld, and 3-GIS at:
http://www.intergraph.com/communications/
http://www.gedigitalenergy.com/geospatial/catalog/smallworld_network.htm
http://www.3-gis.com/
Graphs
My final bit of advice to you is to know your graph theory. Modeling telecom networks requires advanced knowledge of node-edge, edge-edge, and directed graphs. You need this knowledge to model pole-line graphs, underground duct networks, cable sheath-to-sheath connections, fiber and copper strand-to-strand and pair-to-pair connections, and the relationships between various parts of the network.
Beyond the relationships that you model with graphs, you will employ this knowledge to trace networks quickly.
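To make that concrete, here is a minimal sketch of the idea (assuming the networkx library; the facility names and attributes are invented for illustration), modeling a small piece of physical plant and tracing it:

import networkx as nx

# Poles, vaults and cabinets as nodes; cable spans as attributed edges.
plant = nx.Graph()
plant.add_edge("central_office", "pole_01", media="aerial fiber", strands=144)
plant.add_edge("pole_01", "pole_02", media="aerial fiber", strands=48)
plant.add_edge("pole_02", "vault_07", media="underground duct", strands=48)
plant.add_edge("vault_07", "cabinet_03", media="buried copper", pairs=200)

# Trace: everything reachable from the central office.
print(sorted(nx.node_connected_component(plant, "central_office")))

# Shortest physical route between two facilities.
print(" -> ".join(nx.shortest_path(plant, "central_office", "cabinet_03")))

Directed graphs enter the picture when you move from the physical plant to the logical circuits that ride it, since there the direction of the signal matters.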
Related
The question is about using a chat-bot framework in a research study, where one would like to measure the improvement of a rule-based decision process over time.
For example, we would like to understand how to improve the process of medical condition identification (and treatment) using the minimal set of guided questions and patient interaction.
Medical conditions can be formulated into workflow rules by doctors. A possible technical approach for such a study would be to develop an app or website that patients can access, where they ask free-text questions that a predefined rule-based chat-bot addresses. During the study a doctor will monitor the collected data, improve the rules and the possible responses, and provide new responses when the workflow reaches a dead end. We do plan to collect the conversations and apply machine learning to generate an improved workflow tree (and questions) over time; however, the plan is to do any data analysis and processing offline, and there is no intention of building a full product.
This is a low-budget academic study. The PhD student has good development skills and data-science knowledge (Python) and will be accompanied by a fellow student who will work on the engineering side. One of the conversational-AI options recommended for data scientists was RASA.
I have spent the last few days reading about and playing with several chat-bot solutions: RASA and Botpress; I also looked at Dialogflow and read tons of comparison material, which makes the choice even more challenging.
From the sources on the internet it seems that RASA might be a better fit for data-science projects; however, it would be great to get a sense of the real learning curve and how fast one can expect to have a working bot, especially one whose rules have to be updated continuously.
A few things to clarify: we do have data to generate the questions and are in touch with doctors to improve their quality. It seems we need a way to present participants with multiple choices and to provide answers (not just free text). Being on the research side, there is also no need to align with any specific big provider (i.e. Google, Amazon, or Microsoft) unless it has a benefit. The important considerations are time, money, and flexibility: we would like to have a working approach within a few weeks (and continuously improve it), and the whole experiment will run for no more than 3-4 months. We also need to be able to extract all the data. Finally, we are not sure which channel is best for such a study (WhatsApp? A website? Something else?) and what complexities are involved.
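For illustration only, a framework-agnostic sketch of the kind of guided-question workflow with multiple-choice answers described above might look like this (the questions, node names, and choices are invented, and this is plain Python rather than any particular chat-bot framework):

workflow = {
    "start": {
        "question": "What is your main symptom?",
        "choices": {"headache": "headache_duration", "fever": "fever_check"},
    },
    "headache_duration": {
        "question": "How long have you had the headache?",
        "choices": {"less than a day": "advice_rest", "more than a week": "escalate"},
    },
    "fever_check": {
        "question": "Is your temperature above 38 C?",
        "choices": {"yes": "escalate", "no": "advice_rest"},
    },
    "advice_rest": {"response": "Rest and monitor your symptoms."},
    "escalate": {"response": "Please contact a doctor. (Dead end logged for review.)"},
}

def run(node="start"):
    step = workflow[node]
    if "response" in step:
        print(step["response"])
        return
    print(step["question"])
    for choice in step["choices"]:
        print(" -", choice)
    answer = input("> ").strip().lower()
    # Unknown answers fall through to escalation so a doctor can add a rule later.
    run(step["choices"].get(answer, "escalate"))

run()

A framework such as RASA would add intent recognition for the free-text input on top of a rule structure roughly along these lines.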
Any thoughts about the challenges and considerations about dealing with chat-bots would be valuable.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
So this might be a bit of a strange one, but I'm trying to find a tool that would help me visualize real-time data in the form of a table rather than a graph/chart. There are a lot of tools out there like Grafana, Kibana, and Tableau that fit a similar purpose, but for a very different application, and they're primarily made for aggregated data visualization.
I am essentially looking to build something like an airport departure board. You have flight AAA that landed 20 minutes ago, flight XXX departing in 50 minutes, and once flight AAA is cleared it disappears from the board, etc. Only I want that in real time, as the input will be driven by actions users are performing on the shop floor with their RF guns.
I'd be connecting to a HANA database for this. I know it's definitely possible to build it using HTML5, Ajax and WebSockets, but before I set off on the journey of building it myself I want to see if there's anything out there that somebody else has already done better.
Surely there's something out there already, especially in the manufacturing/warehousing space, where having real-time information on big screens is of big benefit?
Thanks,
Linas M.
Based on your description I think you might be looking for a dashboard solution.
Dashboards are used in many scenarios, especially where an overview of the past/current/expected state of a process is required.
This can be aggregated data (e.g. how long a queue is, how many tellers are occupied/available, what the throughput of your process is, etc.) or individual data (e.g. which cashier desk is open, which team player is online, etc.).
The real-time part of your question really boils down to what you define to be real-time.
Commonly, it’s something like “information delivery that is quick enough to be able to make a difference”.
So if, for example, I have a dashboard that tells me I will likely be short of, say, service staff tomorrow evening (based on my reservations), then to make a difference I need to know this as soon as possible, so I can call in more staff for tomorrow's shift. It won't matter much whether the data takes 5 or 10 minutes to get from system entry to the dashboard, but if I only learn about it tomorrow afternoon, that's too late.
Thus, if you’re after a dashboard solution, then there are in fact many tools available and you already mentioned some of them.
Others would be e.g. SAP tools like Business Objects Platform or SAP Cloud Analytics. To turn those into “real-time” all you need to do is to define how soon the data needs to be present in the dashboard and set the auto-refresh period accordingly.
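As a rough sketch of that approach (assuming Flask; the endpoint name and the hard-coded rows are placeholders for whatever you would actually query out of your HANA tables), the server side of a board like the one described in the question can be as small as:

from datetime import datetime
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/board")
def board():
    # Placeholder rows; in practice this would be the result of a query
    # against the HANA tables fed by the RF guns.
    rows = [
        {"order": "AAA", "status": "completed 20 min ago", "updated": datetime.now().isoformat()},
        {"order": "XXX", "status": "due in 50 min", "updated": datetime.now().isoformat()},
    ]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)

The board itself is then just a page with a table that re-fetches /board on the chosen refresh interval and drops rows that are no longer returned; a push mechanism such as WebSockets only becomes worthwhile if a refresh every few seconds is not quick enough for your definition of real-time.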
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
I'm looking for a way to determine whether a picture is explicit (i.e., not safe for work).
I am currently looking for an API that is capable of doing this, but so far I haven't had any success.
One of the ideas I had was to use the Google search API: provide a URL to a picture and check whether it appears in the results when SafeSearch is enabled. However, this will fail for a picture that was added before the crawler got to it.
Alternatively, I'm looking for pointers regarding what to look for in an image to determine how SFW it is. Any suggestions regarding shapes, colors or patterns?
As promised, an SFW paper from Google researchers and a patent for your study, procured from this blog entry.
One of my colleagues led the development of the porn classification technology at one of the largest web companies. I will share what he told me about the development of the filter.
The definition of what is explicit varies greatly among jurisdictions, so what is considered explicit in the US might not be in other parts of the world and vice versa. Models therefore need to take the user's origin into account.
A purely image-based approach is almost impossible to use effectively at web scale. The feature space is very complex in terms of how humans judge what is explicit and what is not, and developing appropriate feature-extraction technology for images proved to be exceedingly difficult.
Some of the most predictive features are the text on the pages that link to the images. They are also among the easiest features to develop.
Building labeled training sets is very difficult, since classifying porn and other explicit content for 8 hours a day tends to take a toll on the labelers. Because of this the turnover is fairly high, with almost no one lasting a year.
Getting high accuracy from the classifiers is still very, very difficult. They worked on it with several PhDs and a very experienced team and still did not achieve the accuracy that you are probably looking for.
If you have a more constrained problem space you can probably achieve a higher accuracy. If you are using image features only the algorithm or model will probably not generalize well and will have a high false positive rate. Best of luck.
See papers:
Detection of Pornographic Digital Images, by Jorge A. Marcial-Basilio, Gualberto Aguilar-Torres, Gabriel Sánchez-Pérez, L. Karina Toscano-Medina, and Héctor M. Pérez-Meana
Pornography Detection Using Support Vector Machine, by Yu-Chun Lin (林語君), Hung-Wei Tseng (曾宏偉), and Chiou-Shann Fuh
Image-Based Pornography Detection, by Rigan Ap-apid, De La Salle University, Manila, Philippines
You can also take some hints from existing implementations e.g.:
"The Porn Detection Stick uses advanced image analyzing algorithms that categorize images as potentially harmful by identifying facial features, flesh tone colors, image back grounds, body part shapes, and more."
Closed 9 years ago.
I'm looking for ideas for a Neural Networks project that I could complete in about a month or so. I'm doing it for the National Science Fair, so I need something that has some curb appeal as well since it's being judged.
It doesn't necessarily have to be completely new and unique; I'm just looking for ideas, but it should be complex enough to impress someone who knows the field. My first idea was to implement a spam filter of sorts, but I recently found out that NNs aren't a very good way to do it. I've already got a basic NN simulator with genetic algorithms, and I'm also adding the generic back-propagation algorithm.
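As a point of reference for the back-propagation part, a complete training loop is small enough to sketch in a few lines; this is a hypothetical 2-4-1 numpy network learning XOR, not taken from the question's simulator:

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: squared-error loss, sigmoid derivatives, learning rate 0.5.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2))  # should approach [[0], [1], [1], [0]]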
Any ideas?
Look into Numenta's Hierarchical Temporal Memory (HTM) concept. This may be slightly off topic if the expectation is of "traditional" Neural Nets, but it is also an extremely promising avenue for Artificial Intelligence.
Although Numenta introduced HTM and its associated software platform, NuPIC, almost five years ago, the first commercial product based upon this technology was released (in beta) a few weeks ago by Vitamin D. It is called Vitamin D Video and essentially turns any webcam or IP camera into a sophisticated video monitoring system, recognizing classes of items (say persons vs. cats or other animals) in the video feed.
With the proper setup, this type of application could make for an interesting display at the Science Fair, one with much "curb appeal".
To whet your appetite, or even get your feet wet with HTM technology, you can download NuPIC and check out its various sample applications. Chances are that you may find something that meets the typical science-fair criteria of both geekiness and coolness.
Generally, HTMs aim at solving problems which are simple for humans but difficult for computers; that statement is somewhat generic and applies to Neural Nets in general, but HTMs take it to the "next level".
Although written in C (I think), NuPIC is typically interfaced from Python, which makes it a convenient test bed for simple yet sophisticated proof-of-concept applications.
You could always try playing around with a neural network and stock prices; if I had a month of spare time for a neural network implementation, that's what I would play with.
A friend of mine in college wrote a NN to play go on a 9x9 board.
I don't think it ever got very good, but I think it would be fun to try.
Look at how a bidirectional associative memory compares with classical edit-distance algorithms (Levenshtein, Damerau-Levenshtein, etc.) for typo correction. Also consider the articles on Hebbian unlearning while training your NN; it seems that the confabulation phenomenon is avoided.
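For the edit-distance side of that comparison, the plain Levenshtein baseline is only a few lines (Damerau-Levenshtein additionally counts a transposition as a single edit):

def levenshtein(a, b):
    # prev[j] holds the edit distance between the current prefix of a and b[:j].
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(levenshtein("recieve", "receive"))  # 2: the swapped letters count as two edits here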
I've done some work on top of NNs, mainly an XML-based language (Neural XML). See the details here:
http://amazedsaint.blogspot.com/search/label/Neural%20Network
Also, one interesting .NET neural network project is AForge.NET; check that out as well.
You can implement the game Cellz or create a controller for it. It was first created by Simon M. Lucas. It's a nice and interesting game, and I'm sure that everyone will love it. I also used it for a school project and it turned out quite well.
On that page you can also find links to other interesting games.
How about applying it to predicting exchange rates (USD-EUR, for example, for sub-minute trading)? It should be fun to show a net gain of money over one month.
I doubt this will work for trades longer than a minute... without a lot of extra work.
I like using committee machines, so why not apply one to face detection in images/movies, or to voice-print authentication?
Finally, you could get it to play pleasing music and use a crowd-sourced fitness function whereby people vote for the best "musicians".
Closed 9 years ago.
Here are the estimates the system should handle:
3000+ end users
150+ offices around the world
1500+ concurrent users at peak times
10,000+ daily updates
4-5 commits per second
50-70 transactions per second (reads/searches/updates)
This will be an internal-only business application, dedicated to helping a shipping company with worldwide shipment management.
What would be your technology choice, why that choice and roughly how long would it take to implement it? Thanks.
Note: I'm not recruiting. :-)
So, you asked how I would tackle such a project. In the Smalltalk world, people seem to agree that Gemstone makes things scale somewhat magically.
So, what I'd really do is this: I'd start developing in a simple Squeak image, using SandstoneDB. Then, the moment would come when a single image begins to be too slow; at that point I would move to GemStone.
GemStone then takes care of copying your public objects (those visible from a certain root) back and forth between all instances. You get sessions and enhanced query functionalities, plus quite a fast VM.
It shares data with C, Java and Ruby.
In fact, they have their own VM for Ruby, which is also worth a look.
Wikipedia manages much more demanding requirements with MySQL.
Your volumes are significant but not likely to strain any credible RDBMS if programmed efficiently. If your team is sloppy (i.e., casually putting SQL queries directly into components which are then composed into larger components), you face the likelihood of a "multiplier" effect where one logical requirement (get the data necessary for this page) turns into a high number of physical database queries.
So, rather than focussing on the capacity of your RDBMS, you should focus on the capacity of your programmers and the degree to which your implementation language and environment facilitate profiling and refactoring.
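To make the multiplier effect concrete, here is a hypothetical illustration using an in-memory SQLite database (the table and column names are invented; the same pattern applies to any RDBMS):

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE offices (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE shipments (id INTEGER PRIMARY KEY, office_id INTEGER, status TEXT);
    INSERT INTO offices VALUES (1, 'Rotterdam'), (2, 'Singapore');
    INSERT INTO shipments VALUES (10, 1, 'open'), (11, 2, 'open'), (12, 2, 'closed');
""")

# Sloppy composition: one logical requirement ("open shipments with office names")
# becomes 1 + N physical queries when each row component fetches its own data.
rows = con.execute("SELECT id, office_id FROM shipments WHERE status = 'open'").fetchall()
for _, office_id in rows:
    con.execute("SELECT name FROM offices WHERE id = ?", (office_id,)).fetchone()

# The same requirement expressed as a single joined query.
result = con.execute("""
    SELECT s.id, o.name
    FROM shipments s JOIN offices o ON o.id = s.office_id
    WHERE s.status = 'open'
""").fetchall()
print(result)  # [(10, 'Rotterdam'), (11, 'Singapore')]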
The scenario you propose is clearly a 24x7x365 one, too, so you should also consider the need for monitoring / dashboard requirements.
There's no way to estimate development effort based on the needs you've presented; it's great that you've analyzed your transactions to this level of granularity, but the main determinant of development effort will be the domain and UI requirements.
Choose the technology your developers know and are familiar with. All major technologies out there will handle such requirements with ease.
Your daily update numbers and your commit rate do not add up: four commits per second is 14,400 per hour, or roughly 345,000 per day if sustained, which is far more than the 10,000+ daily updates you list.
You did not mention anything about expected database size.
In any case, I would concentrate my efforts on choosing a robust back end like Oracle, Sybase, MS SQL Server, etc. This choice will make the most difference in performance. The front end could be either a desktop app or a web app, depending on needs. Since this will be used in many offices around the world, a web app might make the most sense.
I'd go with MySQL or PostgreSQL. Not likely to have problems with either one for your requirements.
I love object databases. In terms of commits per second and database round-trips, no relational database can keep up. Check out db4o. It's dead easy to learn; check out the examples!
As for the programming language and UI framework: take what your team is good at. Dynamic languages with less meta-level time-wasting will probably save time.
There is not enough information provided here to give a proper recommendation. A little more due diligence is in order.
What is the IT culture like? Do they prefer lots of little servers or fewer bigger servers or big iron? What is their position on virtualization?
What is the corporate culture like? What is the political climate like? The open source offerings may very well handle the load but you may need to go with a proprietary vendor just because they are already used to navigating the political winds of a large company. Perception is important.
What is the maturity level of the organization? Do they already have an Enterprise Architecture team in place? Do they even know what EA is?
You've described the operational side but what about the analytical side? What OLAP technology are they expecting to use or already have in place?
Speaking of integration, what other systems will you need to integrate with?