Do any payment gateways accept negative amounts? - omnipay

I'm writing some tests to go with additional validation on the OmniPay core AbstractRequest. I accept this may not be the best place to ask, but I am seeking answers from people who have direct coding experience with the range of payment gateways that OmniPay supports.
Basically, should OmniPay accept a negative value as a valid amount? The amount is mainly used for authorisations and payments, but perhaps other transaction types need to support negative amounts. Are there any?
If negative amounts are not supported in the core by default, is there a preferred way to implement that particular validation rule so that an OmniPay driver can turn it off with minimal effort? A "negative_values_allowed" boolean property on the AbstractRequest, perhaps? Or is a validateNegativeAmount() method preferable?
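For what it's worth, here is a minimal sketch of the boolean-flag approach, written in Kotlin purely as illustration (OmniPay itself is PHP, and the names here are hypothetical rather than OmniPay's actual API): the base request validates the amount, and a driver that genuinely needs negative amounts opts out with a one-line override.

import java.math.BigDecimal

// Hypothetical sketch of the overridable-flag idea; not OmniPay's real API.
abstract class AbstractRequest {
    // Drivers that legitimately accept negative amounts override this.
    protected open val negativeValuesAllowed: Boolean = false

    var amount: BigDecimal? = null

    fun validateAmount() {
        val value = amount ?: throw IllegalStateException("amount is required")
        if (value.signum() < 0 && !negativeValuesAllowed) {
            throw IllegalArgumentException("This gateway does not accept negative amounts")
        }
    }
}

// A driver opts in with one line instead of re-implementing validation.
class ExampleGatewayRequest : AbstractRequest() {
    override val negativeValuesAllowed = true
}

A validateNegativeAmount() method would work equally well; the flag simply makes the common case (reject negatives) the default while keeping the override cost for a driver as low as possible.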

Related

DB structure/architecture with an auth SaaS

My team and I are considering using an authentication SaaS.
I am fairly sure that the SaaS solution will ultimately be more secure than a hand-made one (even one built with proper libraries), and in our case we can afford the cost.
But the thing that makes me hesitate the most is how this service will communicate with the rest of my app. Will it actually simplify our code, or turn it into an even more complicated tangle in the end?
I understand that the user list, credentials, and any additional attributes are stored there.
But then, what should I store in my app's (SQL) DB?
Say I have users that belong to companies and other assets in a 1-n relationship.
I like to write things like:
if current_user.company.assets.include?(current_user.assets)
  # logic here
end
Should I:
only store user IDs in these tables?
=> But then I can't use relationships between user attributes and the rest of the DB attributes.
store some kind of cached data in a so-called sessions table, so I can use it as a disposable table of active users?
=> It feels less secure and implies duplicated content, which kind of sucks.
make the user object virtual, loaded with the auth SaaS data, and use it there?
=> Slightly better, but I can't run queries over users; even company.users is not available.
something else?
I'm highly confused about the benefits of externalizing what's usually the core object of an app. Am I thinking too monolithically? :-D
Can anyone make suggestions? Even better would be feedback from devs who have implemented it.
I found articles on the web; they talk about security and ease of implementation, but they don't tackle this question properly.
Cheers, I hope the question doesn't get closed, as I'm very curious about the answer.
I'm highly confused about the benefits of externalizing what's usually the core object of an app. Am I thinking too monolithically? :-D
Yes, you are thinking too monolithically by assuming that you have to write and control all the code. You have asked a question about essentially outsourcing Authentication to an existing SaaS-based solution when you could just as easily write your own. This is a common mistaken assumption that many developers make, especially in the area of application security.
Authentication is a core requirement for many solutions, but it is very rarely a core aspect or feature of the solution.
By writing your own solution to what is a generally standard concept (Authentication), you have to write, test and maintain your logic, including keeping up to date with the latest security trends over the lifetime of the product. In terms of direct Profit/Cost:
Costs you a lot of time and effort to get it right
Your own solution will add a layer of technical debt; future developers (internal or external) will need to familiarise themselves with your implementation before they can even start maintenance or improvement work
You are directly assuming all the risks and responsibilities to maintain the security of the solution and its data.
Depending on the type of data and the jurisdiction of your application, you may be asked down the track to implement multi-factor authentication, or to force all users to re-register to adopt stronger security protocols. This can be a lot of effort in your own solution, or a simple tick of a box in the configuration of your Authentication provider.
Business / Data Schema
You need to be careful to separate the two concepts of Authentication and of a User in the business domain. Regardless of where or how you Authenticate your users, from a data integrity point of view it is important that there is a User concept in the database to associate related data with each user.
So there is no way around it: your business logic requires a table to represent a User in the business domain.
This User table should have an arbitrary primary key that is specific to the application domain, and that table should store the token that is used to map the business user to the Authentication process. Then, throughout your model, you can create FK references back to the User table.
In this way it may be possible for you to map users to multiple different providers, or to easily change the provider with minimal or zero impact on the rest of the business domain model.
What is important from a business process point of view is that the application can resolve the correct business User from the token or claims provided in the response from the authentication provider.
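As a minimal sketch of that mapping (all names here are hypothetical and provider-agnostic), the application keeps its own primary key and stores only the provider's subject identifier alongside it:

// Hypothetical sketch: the app's own User row carries an internal PK plus
// the external subject (e.g. the token's "sub" claim) issued by the provider.
data class User(
    val id: Long,             // application-domain primary key, target of all FKs
    val authSubject: String,  // identifier mapping this user to the auth provider
    val companyId: Long       // example business relationship
)

interface UserRepository {
    fun findByAuthSubject(subject: String): User?
}

// After the provider has validated the token, resolve the business user locally.
fun resolveCurrentUser(repo: UserRepository, subjectClaim: String): User =
    repo.findByAuthSubject(subjectClaim)
        ?: error("No business user mapped to subject $subjectClaim")

Because the rest of the schema only ever references the internal id, switching providers means re-populating one column rather than rewriting every FK.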
Authentication
If SSO (Single Sign-On) is appealing to you, then the choice of Authentication provider can become an issue, depending on the nature of your solution and the type of users who will be authenticating. If the solution is tenanted to other businesses and offers B2B- or B2C-focused activities, then an enterprise authentication solution like Azure AD or Google Cloud Identity might make sense. You would register your product in the client's authentication domain so that they can manage their users and access levels.
If the solution is more public-facing, then you might consider social media Authentication providers as a means of simplifying authentication for users, rather than forcing them to use your own bespoke authentication process, for which they will invariably forget their password...
You haven't mentioned what language or runtime you are considering; however, if you do choose to write your own Authentication service, as a bare minimum you should consider implementing OAuth 2.0, to ensure that your solution adheres to standard practices and is compatible with other providers should you choose to use them later.
In a .NET-based environment I would suggest Identity Server 4 as a base level of security; there are a lot of resources on implementation, and other frameworks should have similar projects or providers that you can host yourself. The point is that by hosting a standard implementation as your own Authentication service, rather than writing one that is integrated into your software, you are not re-inventing anything, and there is a lot of commercial and community support available to help you minimise the effort and cost of getting things up and running.
Conclusion
Ultimately, if you are concerned with Profit (and let's face it, most of us are), then re-creating the wheel just because you can adds a direct implementation and long-term maintenance Cost, and so will directly reduce Profitability, especially when the effort to integrate an existing Authentication provider into your solution is really low.
Even if you choose today to implement your own Authentication service, it would be wise to implement it in such a way that you could easily offload that workload to an external provider later. That is the natural evolution of security for small to mid-sized applications, once users start to demand more stringent security or additional features than it is cost-effective to provide in your native runtime.
Once security is implemented in an application, the rest of the business process evolves around it and we neglect to come back and review authentication until after a breach, if we or the client ever even detect such an event. For this reason it is important that we get security as right as we can from the very start of a solution.
Whilst not directly related, this discussion reminds me of my favourite quote from Eric Lippert, in a comment on an SO blog post:
Eric Lippert on What senior developers can learn from beginners
...The notion that programming can be principled — that we proceed by understanding the abstractions afforded by the language, and then match those abstractions to a model of the business domain of the program — is apparently never taught to a great many programmers. Rather, many programmers proceed as though they’re exploring an undiscovered country, and going down paths more or less at random and hoping they end up somewhere good, no matter how twisted the path is that gets them there...
One of the reasons that we use external Authentication providers is that the plethora of developers who came before us have already learnt the hard lessons about what to do and what not to do, and have evolved a set of standards and protocols that provide best-practice guidelines on how to protect our users and their data when they are using our software. Many of these external providers represent best-practice implementations, and they maintain them for us as the standards continue to evolve, so that we don't have to.

Is there a way to limit request frequency and sizes out of the box in ktor?

Simply, I have methods like post("/login") { ... } and post("/uploadAvatarImage") { ... }, and I want to limit request rates and sizes appropriately. If I were to hand-roll this myself, I could either find some way to hook into every request and throw if a rolling rate is higher than acceptable, or create overrides of post to include things like rate limits and size restrictions. It would be nice if this were all definable globally in one of the configuration sections, so I don't have to manually check whether a user supplied a 2-billion-character user name.
Is this a feature that exists in Ktor? Or does anyone have a good example of a nice way to do it, if there is something better than what I was thinking?
To answer your question about limiting the size: I am assuming that in the body of these requests you are deserialising to some kind of data class? If so, the valiktor library (link) makes it very easy to add a max size limit on a particular property.
As regards rate limiting, I wouldn't really expect this from the framework itself; however, it seems someone has built a feature for it (link).
Interestingly, Spring Boot doesn't have rate limiting itself either; rather, its competitor to nginx, Spring Cloud Gateway, has a rate-limiting feature. That raises the question of whether rate limiting belongs at another architectural layer (such as a gateway entry point) before requests ever reach your app.
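If you do want a purely in-app stopgap in the meantime, here is a minimal sketch (written against Ktor 1.x APIs; the limits, window length, and keying by client IP are all arbitrary assumptions) that intercepts every call, rejects oversized bodies by their declared Content-Length, and applies a fixed-window request counter:

import io.ktor.application.*
import io.ktor.http.*
import io.ktor.request.*
import io.ktor.response.*
import java.util.concurrent.ConcurrentHashMap

fun Application.installRequestGuards() {
    val windowMs = 60_000L                 // length of each rate window
    val maxRequestsPerWindow = 100         // arbitrary per-IP budget
    val maxContentLength = 1_000_000L      // ~1 MB body cap
    // client IP -> (window start, request count in that window)
    val counters = ConcurrentHashMap<String, Pair<Long, Int>>()

    intercept(ApplicationCallPipeline.Features) {
        // Reject oversized bodies early, based on the declared Content-Length.
        val length = call.request.header(HttpHeaders.ContentLength)?.toLongOrNull()
        if (length != null && length > maxContentLength) {
            call.respond(HttpStatusCode.PayloadTooLarge)
            return@intercept finish()
        }
        // Fixed-window rate limit keyed by caller IP.
        val now = System.currentTimeMillis()
        val counter = counters.merge(call.request.local.remoteHost, now to 1) { old, _ ->
            if (now - old.first > windowMs) now to 1 else old.first to old.second + 1
        }
        if (counter != null && counter.second > maxRequestsPerWindow) {
            call.respond(HttpStatusCode.TooManyRequests)
            return@intercept finish()
        }
    }
}

Note that Content-Length can be absent or wrong on chunked uploads, so a real deployment would also cap the bytes actually read; the gateway-layer approach mentioned above avoids that class of problem entirely.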

Benefits of using Auth0 with a Parse Server/React Native application?

I'm wondering about the additional complexity involved in integrating with Auth0, versus the plenty of available code for password complexity rules, UI, etc. (including the snowflake starter app) for authentication/user creation with the open source Parse Server.
I am sure plenty of people have thought about this, and I was wondering what the consensus is: the requirement to keep profile/email in sync, relying on a 3rd party, the inability to customize the view, and I am sure many other issues.
At first I thought this was great and that I would not need to worry about a lot of things, yet there are a lot of other things to worry about, plus not being able to customize.
What are the most convincing "PRO" answers?
Disclosure: I'm an Auth0 engineer.
TL;DR: I can talk about the pros and cons, but the definitive answer needs to be provided by you.
A bit about Auth0
Auth0 supports the authentication protocols in most widespread use (OAuth2/OIDC, SAML and WS-Federation), so integration into custom software that speaks those protocols, or can be made to speak them through available libraries, is relatively easy and friction-free. Side note: Parse Server does seem to support integration with OAuth-compliant identity providers.
It can be used as a standalone identity provider where your users register and authenticate with username/password credentials specific to your application, but it can also integrate with a lot of external identity providers, for example Google, GitHub and Twitter. This makes it really easy to enable different methods of authenticating users just by configuration, instead of having to talk to each individual provider directly and deal with their implementation discrepancies.
Finally, it provides enough extensibility points for you to still have significant degree of control on the authentication experience, for example:
Rules (JS code) allow you to customize the authentication process
Customization of Auth0 provided authentication user interface allows you to still have your own branding
Customization of Lock allows you to have a custom authentication experience integrated into your own app really quickly and with very little effort
Of course, no matter how many extension points there are, there's always some stuff that you will not be able to control. This can be seen as bad, but sometimes it's actually a good thing; it depends on the perspective and your specific requirements.
Comparison - Roll your own (RYO) vs Third-party service
On one side you'll have:
cost of acquisition of the service
cost of integration of the service with your product
On the other side you'll have:
cost of implementing the features you need
cost of ownership of those features
cost of integration of the new features in your product
In both cases you'll need some integration work in order to make all the parts fit, no matter who created the parts. You could argue that if you are the author of everything, it will be easier to fit a square peg in a round hole, so let's say RYO wins by a small margin on that point.
It then all comes down to comparing the cost of acquisition against the cost of implementation and ownership. I can't answer that one for you, but the cost of acquisition is generally easy to calculate, while the cost of implementing software is very hard to predict; on top of that, owning a custom authentication solution takes a heavy toll... you know what they say: no one ever got fired for buying IBM. I won't go that far, but it's safe to say it's easier to find yourself in a pickle if you roll your own security. :)
Conclusions
Go through the Auth0 trial, play with it and see what it has to offer and how that matches your requirements.
If you find something you're not able to accomplish leave a question here tagged auth0 or on Auth0 Forums.

How to optimise Google Translate API calls to translate multiple words in a single request

Everyone. Recently Google Translate was integrated into my project, where it translates product names, product descriptions, and product-related category names. But because there are plenty of products in my database (and the number increases quickly), the Google Translate API would cost considerable money.
I want to call Google Translate as little as possible. In the translations, many words are the same across many products, for example: 阿迪达斯 - Adidas, 苹果 - iPhone, 篮球 - Basketball, and so on. I want to do some tricks, but have no idea how.
Has anyone encountered such a problem?
Any help would be appreciated.
It sounds like what you need is actually the ability to reuse translations at the string or substring level (in other words, per database entry). You can't really do that with Google, as far as I know. You've got a few options, as I see it:
You could switch over to Microsoft Translator and use their methods that allow you to place translations yourself, such as their Collaborative Translation feature that lets you override the MT with a preferred translation and even vote translations up/down. Quality here will be broadly comparable to Google (I often find it better), and you have methods at your disposal that allow this override. Also, unlike Google, the Microsoft API is free up to a certain volume. Take a look: http://www.microsoft.com/en-us/translator/developers.aspx
Microsoft also has a unique feature called the Microsoft Translator Hub, which can use your terminology for translations, for example. However, depending on how you implemented any solution with Microsoft, you might still have the problem that you are making more calls out to Microsoft than you'd like; moreover, "matching" only takes place at the level of a whole record or string, so it would not cover shared linguistic elements being concatenated into one string.
There's a commercial offering called GeoFluent (full disclosure: I am the product manager for this product, so I'm clearly biased :)) that works with Microsoft Translator but provides pre- and post-translation processing that can deal with sub-segment matching, and so may reduce the volume you are putting through translation each time. It could make sense if, as you mention, you are rapidly adding to your database. Of course, this is a commercial offering too, so you'd have to balance the costs.
Let me know if this helps, and happy to answer any other questions you have.
Marcus
There is a PHP sample here: http://weblite.ca/svn/dataface/modules/tm/trunk/lib/googleTranslatePlugin.php
It allows you to send an array and get an array back. Its getTranslations() method translates all of the user-provided strings into the target language using the Google Translate API and returns an array of source => target strings.
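Building on the reuse idea above, here is a rough sketch of a local translation cache, written in Kotlin (remoteTranslate is a hypothetical stand-in for your batched translation API client, not any particular library's API), so that repeated strings such as brand or category names are only ever sent once:

// Hypothetical sketch: dedupe and cache translations locally so repeated
// strings (brand names, category names, ...) are only ever sent once.
// Assumes the remote call returns a translation for every requested string.
class CachingTranslator(
    private val remoteTranslate: (List<String>) -> Map<String, String>, // batched API call
    private val cache: MutableMap<String, String> = mutableMapOf()      // source -> target
) {
    fun translateAll(sources: List<String>): Map<String, String> {
        val unseen = sources.distinct().filter { it !in cache }
        if (unseen.isNotEmpty()) {
            cache.putAll(remoteTranslate(unseen)) // one call covers all new strings
        }
        return sources.associateWith { cache.getValue(it) }
    }
}

In practice the cache would live in your database next to the products. Product descriptions rarely repeat verbatim, so this helps less there; that is exactly the gap the sub-segment approaches described above are aimed at.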

How long does it take to do a yodlee implementation?

I'm a non-technical (well, non-software; hardware background) founder who has hired a pretty good developer, who has quite capably built a site with a backend on Rails and a frontend in CSS/HTML. Our next step is to develop a Yodlee integration, and we both want to know how long it takes to do this. He has an estimate which I think is reasonable, but I would like feedback from the community without biasing the responses.
Also, if anybody has done an implementation before, I would really appreciate your perspective and help!
I have implemented a complex Yodlee integration for an LA-based start-up over the last two years. They built a social game and money management platform on top of it. The short answer is that it's tough and dirty work.
The technical aspect of getting your application to communicate with the Yodlee API is not at all the hard part (it's pretty much a standard web service). The following are some aspects highlighting the difficulty:
The most difficult part is dealing with the unknowns and the variability in the client data.
There is effectively no documentation for the API
There are several ways to do each operation, and each will return different data
I've been designing and building systems for 15 years and have gotten pretty good at estimating projects. We were way off with Yodlee; in fact, we are still dealing with issues.
In order to understand why it's so tough, you really need to understand what Yodlee is: it is an aggregator of 10,000 different systems. Now, these other systems might be big professional systems like Bank of America, Chase, and so on, but they are often small little banks (Bob's Bank in Omaha).
When Yodlee communicates with the big companies (the sources are called content services), there is almost always an API that actually returns good data. But with the little ones, Yodlee is doing screen scraping. You can imagine that this breaks all the time. They have an entire team in India that is focused on just that.
The other issue is modelling the data: each of the content services has modelled the data at its source differently (different names, different elements, different relationships, ...), but Yodlee must combine all 10,000 models into one view. What this leaves you with is a very bloated model, where you can never know or count on getting a certain data element.
To give you an idea: there are extra fields about a credit account (APR, credit amount, last payment, ...) beyond the standard base-class fields (balance, ...). While it sounds great that you have this data, in practice the number of content services that provide these extra data elements is so low that you can't really depend on them; I'd say the fidelity of those data elements is very low. All you can really count on are the base elements (account name, type, balance) and (transaction date, description and type).
Speaking of transactions: their transaction categorization system is not that good. They have clearly taken a breadth-first approach to it rather than focusing on accuracy. We built an entire system for transaction categorization that is far more effective.
A couple of other things: The DAG account test system is useless; it does not operate the same way real accounts do. You will be far better off opening 5-10 accounts at different content services and giving your developers the usernames/passwords for testing. The MFA (multi-factor authentication) system for account security has been an endless headache. This isn't Yodlee's fault; it's the nature of the game. The banks are doing more and more crazy things that add security layers, and Yodlee has the MFA system in place to compensate for this. At any given time, about 20% of our accounts are in an error state for some reason. We have built an entire component just to manage this.
So what does this all mean? Double your estimate and get ready to get dirty. I don't want to put Yodlee down at all (except for the lack of documentation); they really are solving a hard problem. There really aren't any better options.
I run the team responsible for sales and support of the Yodlee APIs, so this response may be a little biased.
I have seen clients get up and running in anywhere from 10 days to 3 months to 6 months. The time to implement depends on the number of fields in the data model you are using, and on how you are going to use or manipulate the data before presenting it to your users.
While the most prevalent data fields, such as account balance or transaction amount, will always be available, Craig is right: as you get into the broader data model, you will have to code for exceptions when the data is not there. Yodlee does provide documentation on how often each field will be available, to help with this process. But if you are only going to be using basic account and transaction information, you will not have to worry about these complexities, and it will speed implementation.
How you use the data once you receive it from Yodlee will also play a big part in the time it takes to integrate. If you are deriving additional data from the transaction descriptions, or are doing something with categorization, then there is more complexity and it will require more time. If you are using many of the fields as-is, then this will be easier.
The other item that Craig mentioned is the extra security questions (multi-factor authentication). While that section of the API does add some work, we have added documentation around it to make integration easier. Also, for any development issues that come up, we give clients access to a developer forum that is monitored by our Technical Consulting team.