Can I search encrypted data in Laravel? - laravel-encryption

In Laravel, I have encrypted records in my table and I would like to search these records.
The problem occurs when I search: encrypting my search term again produces a different random-looking string each time, so it never matches the value stored in the table.

Related

Decrypt encrypted column in SQL Server which is in a non-human-readable format

Recently I imported some production data into a table. In that table, some columns are encrypted and stored in a non-human-readable format, for example: 'Õ;Q€Kùu'.
I have tried to convert it to a human-readable format like this:
SELECT CONVERT(NVARCHAR(MAX), 'Õ;Q€Kùu')
But it didn't return the correct output. Is it possible to convert the non-human-readable data into a human-readable format?
You cannot decrypt that column without knowing the encryption key, which should be somewhere in the software's code. However, if the values are actually MD5 hashes rather than encrypted data, there are MD5 lookup sites with databases of precomputed hashes that may help.
Try searching Google for MD5 decrypt.

Regular expression search of Oracle BLOB field

I have a table with a BLOB field containing SOAP-serialised .NET objects (XML).
I want to search for records representing objects with specific values against known properties. I have a working .NET client that pulls back the objects and deserialises them one at a time to check the properties; this is user-friendly but creates a huge amount of network traffic and is very slow.
Now I would like to implement a server-side search by sending a regular expression to a stored procedure that will search the text inside the BLOB. Is this possible?
I have tried casting the column to varchar2 using utl_raw.cast_to_varchar2, but the text is far too long for a varchar2 (in some cases 100 KB).
dbms_lob.instr allows me to search the text field for a substring, but with such a complex XML structure I would like the additional flexibility offered by regular expressions.
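One possible direction, sketched here purely as an illustration: read the BLOB in chunks small enough to fit in a varchar2 and apply REGEXP_LIKE to each chunk. The function name and the table/column names (blob_matches_regex, soap_docs, payload) are invented for the example, it assumes the BLOB bytes are in the database character set, and a match that straddles a chunk boundary would be missed, so treat it as a starting point rather than a finished solution:
CREATE OR REPLACE FUNCTION blob_matches_regex(p_blob IN BLOB, p_pattern IN VARCHAR2)
RETURN NUMBER
IS
  c_chunk  CONSTANT PLS_INTEGER := 2000;  -- bytes per chunk, small enough for a VARCHAR2
  l_len    PLS_INTEGER := DBMS_LOB.GETLENGTH(p_blob);
  l_offset PLS_INTEGER := 1;
  l_piece  VARCHAR2(4000);
BEGIN
  WHILE l_offset <= l_len LOOP
    -- DBMS_LOB.SUBSTR on a BLOB returns RAW, so convert it to text before matching
    l_piece := UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(p_blob, c_chunk, l_offset));
    IF REGEXP_LIKE(l_piece, p_pattern) THEN
      RETURN 1;
    END IF;
    l_offset := l_offset + c_chunk;
  END LOOP;
  RETURN 0;
END;
/
-- Example use against the assumed table:
SELECT id FROM soap_docs WHERE blob_matches_regex(payload, '<CustomerId>42</CustomerId>') = 1;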

How do you do a PostgreSQL fulltext search on encoded or encrypted data?

For various reasons that don't matter here we are storing chunks of text in either an encrypted or base64 encoded format in PostgreSQL. However, we want to be able to use PostgreSQL's fulltext search to find and return data which in its unencrypted/decoded form matches a search query.
How would one go about accomplishing this? I've seen other posts mention the ability to build the tsvector values before sending data to the database, but I was hoping there would be something available on the Postgres end of things (at least for the base64 text).
Encrypted values
For encrypted values you can't. Even if you created the tsvector client-side, the tsvector itself would still contain a normalised form of the plaintext, so it wouldn't be acceptable for most applications. Observe:
regress=> SELECT to_tsvector('my secret password is CandyStrip3r');
to_tsvector
------------------------------------------
'candystrip3r':5 'password':3 'secret':2
(1 row)
... whoops. It doesn't matter if you create that value client-side instead of using to_tsvector, it'll still have your password in cleartext. You could encrypt the tsvector, but then you couldn't use it for fulltext search.
Sure, given the encrypted value:
CREATE EXTENSION pgcrypto;
regress=> SELECT encrypt( convert_to('my s3kritPassw1rd','utf-8'), '\xdeadbeef', 'aes');
encrypt
--------------------------------------------------------------------
\x10441717bfc843677d2b76ac357a55ac5566ffe737105332552f98c2338480ff
(1 row)
you can (but shouldn't) do something like this:
regress=> SELECT to_tsvector( convert_from(decrypt('\x10441717bfc843677d2b76ac357a55ac5566ffe737105332552f98c2338480ff', '\xdeadbeef', 'aes'), 'utf-8') );
to_tsvector
--------------------
's3kritpassw1rd':2
(1 row)
... but if the problems with that aren't immediately obvious after scrolling right in the code display box then you should really be getting somebody else to do your security design for you ;-)
There's been tons of research on ways to perform operations on encrypted values without decrypting them (homomorphic encryption), like adding two encrypted numbers together to produce a result that's encrypted with the same key, so the process doing the adding doesn't need the ability to decrypt the inputs in order to get the output. It's possible some of this could be applied to fulltext search - but it's way beyond my level of expertise in the area and likely to be horribly inefficient and/or cryptographically weak anyway.
Base64-encoded values
For base64 you decode the base64 before feeding it into to_tsvector. Because decode returns a bytea and you know the encoded data is text, you need to use convert_from to turn the bytea into text in the database encoding, e.g.:
regress=> SELECT encode(convert_to('some text to search','utf-8'), 'base64');
encode
------------------------------
c29tZSB0ZXh0IHRvIHNlYXJjaA==
(1 row)
regress=> SELECT to_tsvector(convert_from( decode('c29tZSB0ZXh0IHRvIHNlYXJjaA==', 'base64'), getdatabaseencoding() ));
to_tsvector
---------------------
'search':4 'text':2
(1 row)
In this case I've used the database encoding as the input to convert_from, but you need to make sure you use the encoding that the underlying base64 encoded text was in. Your application is responsible for getting this right. I suggest either storing the encoding in a 2nd column or ensuring that your application always encodes the text as utf-8 before applying base64 encoding.
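As a rough illustration of the "encoding in a second column" suggestion (the docs table and the body_b64/body_encoding column names are invented for this example):
CREATE TABLE docs (
    id            serial PRIMARY KEY,
    body_b64      text NOT NULL,   -- base64-encoded text
    body_encoding text NOT NULL    -- encoding used before base64, e.g. 'UTF8' or 'LATIN1'
);
-- Build the tsvector using the per-row encoding instead of getdatabaseencoding()
SELECT id,
       to_tsvector(convert_from(decode(body_b64, 'base64'), body_encoding::name))
FROM docs;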

Update a Term's value before searching

I am using Lucene 3.6.1.
Do you know if there is a way to change a Term's value (Term.text()) before Lucene actually performs the search on the Document holding this Term?
I need this preprocessing because the value is encrypted when written to the index. I also need to do fuzzy and/or approximate searches on this Term.
Best regards.
You want to change a value stored in the index BEFORE you've found it? No, that doesn't make sense.
If you are storing data encrypted in the index, you'll need to search it using encrypted data. If you need to be able to take advantage of proper text searching, you will simply need to index it in an unencrypted form. Unless you are using some form of encryption that is friendly to text searching, I guess. I suppose if it were a simple cipher or something, you could encrypt both the indexed value and the query and search just fine. Apart from that, though, I don't think employing fuzzy searches on encrypted data is going to be feasible.
My Recommendation:
You could index, but not store, an unencrypted form of the field, allowing you to take advantage of searching as you need.
A second field could then be created to store the encrypted form, housing the retrievable version of the value. Whether you index that field as well depends on whether you might, in some cases, wish to search using encrypted data, but I would guess not.
Something like:
doc.add(new Field("fieldname", value, Field.Store.NO, Field.Index.ANALYZED));
doc.add(new Field("fieldnameencrypted", encryptedValue, Field.Store.YES, Field.Index.NO));
Only fieldname can be searched, but only fieldnameencrypted can be retrieved from a found document (in its encrypted form).

Lucene field from TokenStream with stored values

I have a field which needs to come from a token stream; it cannot be instantiated with a string and then analyzed into tokens. For example, I might want to combine the data from multiple columns (in my RDBMS) into a single Lucene field, but I want to analyze each column in its own way. So I cannot simply concat them all as a single string then analyze the resulting string.
The problem I am running into now is that fields created from token streams cannot be stored, which makes sense in the general case since the stream may not have an obvious string representation. However, I know the string representation, and I would like to store that.
I tried adding the same field twice, once stored with the string data and once coming from a token stream, but it seems that this can't be done. Apart from some hack like adding a field with a name of "myfield__stored", is there a way to do this?
I am using Lucene 2.9.2.
I found a way. You can sneak it in by instantiating it as a normal field but calling SetTokenStream later:
Field f = new Field(Name, StringValue, Store, Analyzed, TV);
f.SetTokenStream(TokenStreamValue);
Because the reader/string value is only indexed if the token stream value is null, the token stream value is what gets indexed here. The store logic looks at the string/reader regardless of the token stream, so the string value is what gets stored.