Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 5 years ago.
My team and I are currently building multiple services in parallel. We have the benefit of building all the services from scratch. I would like the ability to automatically display all API endpoints, from all services, in one page/site. This would be helpful because (among other things):
I don't have to go to multiple documentation sites to see which endpoints are available across my entire "system".
It'll be a good first step to determine if any of the services should be split, combined or simply refactored.
Some of our services are in Django and the rest-swagger module is a great help. But I don't see how I can combine rest-swagger documentation from multiple services into a single documentation page/site.
I'm currently looking through this site and anything related to the Netflix experience, but I haven't found a solution to my problem. Maybe centralized documentation isn't a big deal with 600+ services at Netflix, but that's hard to believe.
Can anyone suggest a tool or method to have a combined API documentation for all services in a microservice architecture?
My ideal scenario of what happens when a service is changed:
I click on the link to see the list of endpoints in my system.
A teammate updates a service and also its documentation.
I refresh the page I'm currently on and see the change made in step #2.
In my experience, you have a few options:
http://readme.io/
Make a wiki with JIRA or Redmine.
Create a dedicated docs repository on GitHub.
Google Docs.
I don't know of an existing tool for this; I'm just sharing my thoughts on where to keep it.
From what the OP describes, they are already building a microservices architecture using the Netflix stack. There should be a repository to configure the name (or URL) of each service, and the 'config server' or 'service registry' will read from that. To me, that's the perfect place to put the reference to each microservice's documentation under its own entry. This way you get the benefit of maintaining the documentation and code in the same place, plus you could potentially also collect runtime information like instance/connection counts if you hook into the config/registry server.
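To make that concrete, here is a minimal sketch, assuming each service exposes a Swagger/OpenAPI JSON document; the service names and URLs below are hypothetical placeholders for whatever your registry actually holds. It pulls each service's spec and merges the paths into one combined document that a single Swagger UI page could render:

import json
import requests

# Hypothetical schema endpoints; in practice these would come from the
# config server / service registry entries mentioned above.
SERVICE_SCHEMAS = {
    "billing": "http://billing.internal/swagger.json",
    "accounts": "http://accounts.internal/swagger.json",
}

def build_combined_spec():
    combined = {
        "swagger": "2.0",
        "info": {"title": "All services", "version": "1.0"},
        "paths": {},
    }
    for name, url in SERVICE_SCHEMAS.items():
        spec = requests.get(url, timeout=5).json()
        for path, operations in spec.get("paths", {}).items():
            # Prefix each path with the service name to avoid collisions.
            combined["paths"]["/" + name + path] = operations
    return combined

if __name__ == "__main__":
    print(json.dumps(build_combined_spec(), indent=2))

Regenerating this file on deploy (or on a schedule) and serving it behind one Swagger UI instance would give you the single, always-current endpoint listing the question asks for.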
Being in a similar situation, I am looking to adopt https://readthedocs.org/ backed by Git.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 1 year ago.
Currently I'm using Azure Logic Apps to sync changes to different third parties.
But it's too expensive when there are large volumes of requests/messages.
The key features:
An MQ connector, which can be used as a trigger.
An HTTP processor, used to issue HTTP requests.
JSON response parsing.
The ability to check run history.
I've done some research on Apache NiFi.
My feeling is that it's not very user-friendly and quite old school.
One close open-source option that I know of is n8n.
But you could also explore the fixed pricing model (Integration Service Environment) that Logic Apps offers, which is charged by the hour instead of by incoming volume. Depending on how your volume fluctuates, you can scale up with more units as required.
Also, a completely new way to develop and run Logic App workflows was announced (currently in preview), which introduces a new pricing model (the same as App Service or the Functions premium plan).
It also introduces a Docker-based deployment, which lets you run your Logic Apps anywhere.
Apache NiFi can be used for your requirements.
Apache NiFi is an easy to use, powerful, and reliable system to process and distribute data.
It has:
A ConsumeMQTT processor, which subscribes to a topic and receives messages from an MQTT broker.
An InvokeHTTP processor, which can interact with a configurable HTTP endpoint.
Numerous JSON processors.
A Data Provenance feature, which tracks the dataflow from beginning to end.
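For a sense of the pipeline those processors implement, here is a rough hand-rolled sketch of the flow the question describes (MQ trigger, HTTP call, JSON parsing). It assumes a paho-mqtt 1.x client, a local broker, and a hypothetical third-party endpoint; tools like NiFi or n8n add the retries, run history, and UI on top of this basic shape:

import json
import requests
import paho.mqtt.client as mqtt

BROKER = "localhost"                      # assumption: MQTT broker host
TOPIC = "orders/updates"                  # assumption: topic used as the trigger
TARGET = "https://example.com/api/sync"   # assumption: third-party HTTP endpoint

def on_message(client, userdata, msg):
    payload = json.loads(msg.payload)            # parse the triggering message
    resp = requests.post(TARGET, json=payload, timeout=10)
    result = resp.json()                         # parse the JSON response
    print("synced", msg.topic, result)           # stand-in for run history

client = mqtt.Client()          # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect(BROKER)
client.subscribe(TOPIC)
client.loop_forever()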
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
I want to be able to contribute to Google Translate on my native language (Sinhala).
Although there is an online portal (http://translate.google.com/community/) where we can contribute to the translator by translating new phrases or validating existing translations, I would like to create my own, lightweight portal (maybe an Android app) for the contribution service. However, I was unable to find any public API for the translate contribution platform, despite a thorough Google search and a full search through the Google Translator Toolkit API forum (https://productforums.google.com/forum/m/#!categories/translator-toolkit-api) (which seems to have been closed down since the end of 2012).
Currently my best hope is to mimic the request-response sequence followed by the online portal itself. For example, the following request is used by the online portal to fetch a question list for manual translation:
GET http://translate.google.com/community/question_list?sl=en&tl=si&client=t
However, it requires that all the related cookies are properly initialized and passed with the request, which would probably not be easy to mimic in a non-browser environment (such as an Android app). Hence I believe there's a better approach (maybe a yet undocumented API?) somewhere out there.
Does anyone know of any API for accessing this translation contribution feature?
Thanks in advance.
Please note: I am NOT looking for a way to improve Google Translate itself, but for contributing to the actual translation content as described under "How can I help?" in the Google Translate Community FAQ (https://docs.google.com/document/d/1dwS4CZzgZwmvoB9pAx4A6Yytmv7itk_XE968RMiqpMY/pub#h.e1ahmpftpdum).
P.S. I was initially planning to post this question on the Web Apps Stack Exchange, but after reading this post I decided to first try it here.
I'm one of the engineers behind Translate Community, and I'm really excited that you want to see it on more platforms. The site is currently under active development, and we're making it more accessible on mobile platforms without having to create dedicated native apps.
For the time being, we don't anticipate releasing a public API while the platform is under active development. Until we do release a public API, please don't use any HTTP commands you find to create a separate app. Instead, just let us know how we can make the app a better experience for you, and we'll work on it.
Thanks!
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I work for an insurance company. We have our own development department made up of almost 150 people, plus some providers (mostly outsourcing and custom-made apps). In our company, my team has built what we call non-functional logic libraries: software libraries that handle concerns which are horizontal to all the development teams in our department, e.g. security, web services, logging, messaging and so on. Most of these tools are either made from scratch or adaptations of a de facto standard. For example, our logger is an appender based on Log4j that also saves the logging messages to a DB. We also define which libraries to use in applications, for example which web-service framework to use. We use mostly Java EE and Oracle AS across the organization (with some WebSphere application servers).
Most of these projects have their architecture documented (use cases, UML diagrams, etc.), and the generated documentation is generally available.
What we have seen is that users sometimes find it difficult to use the libraries we provide, and they are constantly asking questions or simply don't use them.
So we are planning to produce friendlier documentation for them. My question is:
What are the best practices or the checklist that software documentation should have?
Some things that come to mind:
API reference guide
Quick-start tutorial
Generated API documentation
Must be searchable
Web access
What else should it have? Also, based on your experience, what is the best way to maintain (keep up-to-date) and publish this type of documentation?
Keep your documentation in version control too.
Make sure every page has a version number so you know which version your user has been reading.
Get a CI server going and push documentation to a LIVE documentation site upon updates.
Do documentation reviews like you would code reviews.
Dog-food it :)
Kindness,
Dan
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 5 years ago.
In the past I have used nexo.com to share documents with sales, marketing, PR, and technical people for a small startup. But I wonder if there is a better solution for allowing different types of geographically dispersed workers to get to a variety of uploaded documents. I don't want to have to build or host this myself, and free or cheap is always nice.
I read about Confluence, but it seems to be way more than what I need. I simply want access-controlled folders in the cloud.
I haven't used it myself just yet, but I've heard great things about Google Docs.
We use s3fm for that. It's a free solution but requires an Amazon S3 account. Since we have one for our hosting needs, that was an obvious choice. But given Amazon S3's rock-bottom pricing, I think it might make sense to consider opening one just for this.
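If you go the S3 route, access control can be as simple as keeping the bucket private and handing out time-limited download links. Here is a minimal sketch with boto3; the bucket and key names are hypothetical placeholders:

import boto3

s3 = boto3.client("s3")  # credentials come from the usual AWS config/env vars

def share_document(bucket, key, expires_seconds=3600):
    # Generate a presigned URL that grants read access to one object
    # for a limited time, without making the bucket public.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_seconds,
    )

print(share_document("example-shared-docs", "marketing/brochure.pdf"))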
Love Dropbox!!! I haven't used it for setting up a lot of group access, though.
Sounds like Google Sites would help you a lot. You can set up a network of distinct Web sites -- one for sales, another for marketing, another for PR -- and upload your files to them. You can determine who has access to each site as well as each page of content.
In case anyone else checks this question:
I wound up using filesanywhere.com; it has the exact features I was looking for.
We use a combination of:
Backpack
SVN
JungleDisk
Take a look at Dropbox.
Access control is somewhat limited, but it's been working out very well for me.
Unfortunately, I'm in the middle of writing such an application for a client. The best thing I can recommend is taking an existing web-based file manager and adding in the permission feature.
With a big freaking huge disclaimer that I work on this as my day job:
If you're looking for feedback on those documents Backboard gives you web-based viewing and collaboration with no software required.
There is a product called DocPro. It allows you to set up various security levels, routing methods, etc. It's web-based, so you can use it with geographically dispersed team members across the globe. It's not free, but I think it's cheap.
Check this link
http://www.omnexsystems.com/Faq/documentpro.html
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 5 years ago.
I saw this over on Slashdot and realized that if I could program DNS control into some of my apps, it would make life a lot easier. Someone over there recommended Dynect, which apparently has a wonderful API. Unfortunately, not much else was recommended. I don't know anything about managing DNS servers; I mostly work with the LAMP stack, so on Linux I understand BIND is the way to go, but beyond a basic setup I'd be lost. I'd rather outsource the details.
What DNS services have APIs and are easy to use without breaking the bank?
I guess in the last 3 years this has become a bit of a solved problem. Here are some to check out:
Amazon now has a nice DNS service: http://aws.amazon.com/route53/ (see the sketch after this list).
Linode has a free API-based DNS if you're a customer.
Dynadot has a free DNS with an API if you're a customer.
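As an example of what "programming DNS control into your apps" looks like against one of these, here is a minimal sketch using Amazon Route 53 via boto3; the hosted zone ID, record name, and IP below are hypothetical placeholders:

import boto3

route53 = boto3.client("route53")

def upsert_a_record(zone_id, name, ip, ttl=300):
    # Create the A record if it doesn't exist, or update it if it does.
    return route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": name,
                    "Type": "A",
                    "TTL": ttl,
                    "ResourceRecords": [{"Value": ip}],
                },
            }],
        },
    )

upsert_a_record("Z123EXAMPLE", "app.example.com.", "203.0.113.10")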
Hey I haven't used them, but Zerigo looks promising. We will probably wind up going with them if they allow enough hosts. Their API is standard REST stuff... very straightforward.
http://www.zerigo.com/docs/apis/dns/1.1
Thanks,
Eric.
We use djbdns, and it's backed by MySQL, so we just hit the DB to make changes and periodically rebuild the config data.
Has anyone seen any of the following DNS providers with APIs?
http://durabledns.com/
https://dnsimple.com/ (also supports registration by API)
http://www.loaddns.com/
We use Zonomi. It's very cheap and has never gone down for us, and it has an API.
You can try http://customdns.ca. I have a couple of domains with them - no problems so far. They provide a RESTful API.
http://www.dns.com
Here's the link to the API documentation:
https://github.com/dnsdotcom/API_DOC/
Have fun!
I haven't used the API, but I have been using the registrar for 10+ years and never had a problem: namecheap.com
Here is the API intro.
Here is the API methods list.
Pretty comprehensive, covering everything from purchasing to host records and e-mail forwarding setup.