How to save console result of osquery as a CSV or Excel file in Windows - osquery

I am working with osquery and want to save query results to a particular file as Excel or CSV.
I am trying the following, but it does not give me what I want:
$ osqueryi --json 'select * from osquery_info' > res.json
$ cat res.json
{"build_distro":"10.12","build_platform":"darwin","config_hash":"e7c68185a7252c23585d53d04ecefb77b3ebf99c","config_valid":"1","extensions":"inactive","instance_id":"38201952-9a75-41dc-b2f8-188c2119cda1","pid":"26255","start_time":"1552676034","uuid":"4740D59F-699E-5B29-960B-979AAF9BBEEB","version":"3.3.0","watcher":"-1"}
]
When I run the query below:
osquery> select * from time;
+---------+------+-------+-----+------+---------+---------+----------+------------+----------------+------------+------------------------------+----------------------+----------------------+--------------------+
| weekday | year | month | day | hour | minutes | seconds | timezone | local_time | local_timezone | unix_time  | timestamp                    | datetime             | iso_8601             | win_timestamp      |
+---------+------+-------+-----+------+---------+---------+----------+------------+----------------+------------+------------------------------+----------------------+----------------------+--------------------+
| Friday  | 2019 | 8     | 23  | 12   | 24      | 45      | UTC      | 1566563085 | UTC            | 1566563085 | Fri Aug 23 12:24:45 2019 UTC | 2019-08-23T12:24:45Z | 2019-08-23T12:24:45Z | 132110366857557098 |
+---------+------+-------+-----+------+---------+---------+----------+------------+----------------+------------+------------------------------+----------------------+----------------------+--------------------+
osquery>
I want to save this output to an Excel or CSV file.

osqueryi documents a --csv flag. Does that do what you want? (--json outputs JSON.)
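For example, an untested sketch against the time table from the question:
$ osqueryi --csv 'select * from time' > res.csv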
Depending on what you're doing, many people use osquery as a daemon (or service) with scheduled queries.
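If you go that route, osqueryd writes each run's results to its results log as JSON lines, which you can post-process into CSV. A minimal sketch of a scheduled query in the osquery config (the query name "time_snapshot" and the interval here are made-up values for illustration):
{
  "schedule": {
    "time_snapshot": {
      "query": "select * from time;",
      "interval": 3600
    }
  }
}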

I took help from the other answers, tried and failed many times, but was finally able to accomplish this task.
How to save an osquery result to a CSV file:
Follow this syntax and form your query accordingly:
C:\Program Files\osquery>osqueryi.exe --csv --separator "," "select name,action,path,enabled,state,hidden from scheduled_tasks" > d.csv
(Use >> instead of > if you want to append to an existing file.)
Note: the double quotes around the query are required; please don't miss them.

Related

Error converting pandoc tables from .md to PDF

I am trying to convert Markdown files to PDF with
FOR %%i IN (*.md) DO pandoc "%%~fi" -o "%%~dpni.pdf" --template=weber-export.tex --pdf-engine=xelatex
But it won't convert pandoc tables; instead, an error occurs:
Error producing PDF.
! Missing number, treated as zero.
<to be read again>
(
l.357 ...columnwidth - 2\tabcolsep) * \real{0.33}}
Simplified versions of the tables it won't convert are:
Consumption / yield
---------------------
+--+---+--+
| | | |
| | | |
| | | |
+--+---+--+
| | | |
| | | |
| | | |
+--+---+--+
| | | |
| | | |
| | | |
+--+---+--+
and
-------------------------------------------------------------------
Type                            Sales unit   Number
------------------------------- ------------ ----------------------
Plastic bucket                  30 liters    18 buckets
-------------------------------------------------------------------
as explained in the Pandoc User's Guide.
I have MiKTeX 2.9 and pandoc 2.11.2 installed on Windows 10.
Without the tables, the conversion works fine. Even when I try tables without the first line, like
Type                            Sales unit   Number
------------------------------- ------------ ----------------------
Plastic bucket                  30 liters    18 buckets
-------------------------------------------------------------------
it works.
Any suggestions? Is it a bug, or what am I doing wrong?
It seems that your custom template does not load all the necessary packages. The \real{} command that pandoc emits for table column widths comes from the calc package, which is what the "Missing number, treated as zero" error is pointing at. Add
\usepackage{calc,array}
to your template to ensure the necessary commands are available.
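For reference, pandoc's default LaTeX template pulls in roughly the following when a document contains tables; mirroring it in weber-export.tex should be safe (the exact lines vary a little between pandoc versions, so treat this as a sketch):
\usepackage{longtable,booktabs,array}
\usepackage{calc} % pandoc uses this for calculating table column widths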

Group based on time difference between two date values

I've searched around, but haven't been able to find anyone else with this same question.
I'm working with SQL Server (2008 R2).
Let's say I have the following three rows of data coming back from my query. What I need to do is group the first two rows into one (in either SQL Server or SSRS) based on the difference in minutes between the Start Time and the End Time (the Duration). How much time elapses between one row's End Time and the next row's Start Time is of no concern; I'm only looking at Duration.
Current result set:
+---------+------------+------------+----------+
| Vehicle | Start Time | End Time   | Duration |
+---------+------------+------------+----------+
| 12      | 1:56:30 AM | 2:07:47 AM | 11       |
+---------+------------+------------+----------+
| 12      | 2:07:57 AM | 6:46:08 AM | 279      |
+---------+------------+------------+----------+
| 19      | 2:55:02 PM | 3:45:59 PM | 53       |
+---------+------------+------------+----------+
Desired result set:
+---------+------------+------------+----------+
| Vehicle | Start Time | End Time   | Duration |
+---------+------------+------------+----------+
| 12      | 1:56:30 AM | 6:46:08 AM | 290      |
+---------+------------+------------+----------+
| 19      | 2:55:02 PM | 3:45:59 PM | 53       |
+---------+------------+------------+----------+
I feel like it has to be a matter of grouping, but I'm not sure how to group based on whether or not the start and end times are less than 15 minutes apart.
How can this be accomplished?
Unless I misunderstood your question, try this:
Select Vehicle
      ,StartTime = min(StartTime)
      ,EndTime   = max(EndTime)
      ,Duration  = sum(Duration)
From   YourTable
Group By Vehicle
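Note that this rolls up every row for a vehicle. If the 15-minute rule matters (i.e. a short row should only merge with its neighbour, not with every row for the vehicle), a gaps-and-islands variant works on 2008 R2, which lacks LAG. This is only a sketch under my reading of the sample data (a row whose Duration is under 15 minutes is folded into the following row for the same vehicle); YourTable and the column names are assumed from the question:
;With Ordered As
(
    Select *,
           rn = row_number() over (partition by Vehicle order by StartTime)
    From   YourTable
),
Flagged As
(
    Select cur.*,
           -- start a new island unless the previous row was a short (<15 min) one
           IsNewGroup = case when prev.rn is not null and prev.Duration < 15
                             then 0 else 1 end
    From   Ordered cur
    Left Join Ordered prev
           on  prev.Vehicle = cur.Vehicle
           and prev.rn      = cur.rn - 1
),
Grouped As
(
    Select f.*,
           -- running total of the flags numbers the islands
           -- (2008 R2 has no windowed running sum, hence the subquery)
           grp = (Select sum(f2.IsNewGroup)
                  From   Flagged f2
                  Where  f2.Vehicle = f.Vehicle and f2.rn <= f.rn)
    From   Flagged f
)
Select Vehicle
      ,StartTime = min(StartTime)
      ,EndTime   = max(EndTime)
      ,Duration  = sum(Duration)
From   Grouped
Group By Vehicle, grp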

Last accessed timestamp of a Netezza table?

Does anyone know of a query that gives me details on the last time a Netezza table was accessed for any of the operations (select, insert, or update)?
Depending on your setup you may want to try the following query:
select *
from _v_qryhist
where lower(qh_sql) like '%tablename %'
There are a collection of history views in Netezza that should provide the information you require.
Netezza does not track this information in the catalog, so you will typically have to mine that from the query history database, if one is configured.
Modern Netezza query history information is typically stored in a dedicated database. Depending on permissions, you may be able to see whether history collection is enabled, and which database it is using, with the following command. The output is very wide, so it is shown here with one setting per line:
SYSTEM.ADMIN(ADMIN)=> show history configuration;
CONFIG_NAME                 | ALL_HIST_V3
CONFIG_DBNAME               | NEWHISTDB
CONFIG_DBTYPE               | 1
CONFIG_TARGETTYPE           | 1
CONFIG_LEVEL                | 20
CONFIG_HOSTNAME             | localhost
CONFIG_USER                 | HISTUSER
CONFIG_PASSWORD             | aFkqABhjApzE$flT/vZ7hU0vAflmU2MmPNQ==
CONFIG_LOADINTERVAL         | 5
CONFIG_LOADMINTHRESHOLD     | 4
CONFIG_LOADMAXTHRESHOLD     | 20
CONFIG_DISKFULLTHRESHOLD    | 0
CONFIG_STORAGELIMIT         | 250
CONFIG_LOADRETRY            | 1
CONFIG_ENABLEHIST           | f
CONFIG_ENABLESYSTEM         | f
CONFIG_NEXT                 | f
CONFIG_CURRENT              | t
CONFIG_VERSION              | 3
CONFIG_COLLECTFILTER        | 1
CONFIG_KEYSTORE_ID          | 0
CONFIG_KEY_ID               | 0
KEYSTORE_NAME               |
KEY_ALIAS                   |
CONFIG_SCHEMANAME           | HISTUSER
CONFIG_NAME_DELIMITED       | f
CONFIG_DBNAME_DELIMITED     | f
CONFIG_USER_DELIMITED       | f
CONFIG_SCHEMANAME_DELIMITED | f
(1 row)
Also make note of the CONFIG_VERSION, as it will come into play when crafting the following query example. In my case, I happen to be using the version 3 format of the query history database.
Assuming history collection is configured, and that you have access to the history database, you can get the information you're looking for from the tables and views in that database; these are documented in the IBM Netezza documentation. The following is an example, which reports when the given table was the target of a successful insert, update, or delete by referencing the "usage" column. Here I use one of the history-table helper functions to unpack that column.
SELECT FORMAT_TABLE_ACCESS(usage),
       hq.submittime
FROM "$v_hist_queries" hq
INNER JOIN "$hist_table_access_3" hta
        USING (NPSID, NPSINSTANCEID, OPID, SESSIONID)
WHERE hq.dbname = 'PROD'
  AND hta.schemaname = 'ADMIN'
  AND hta.tablename = 'TEST_1'
  AND hq.SUBMITTIME > '01-01-2015'
  AND hq.SUBMITTIME <= '08-06-2015'
  AND (
        instr(FORMAT_TABLE_ACCESS(usage), 'ins') > 0
        OR instr(FORMAT_TABLE_ACCESS(usage), 'upd') > 0
        OR instr(FORMAT_TABLE_ACCESS(usage), 'del') > 0
      )
  AND status = 0;
 FORMAT_TABLE_ACCESS | SUBMITTIME
---------------------+----------------------------
 ins                 | 2015-06-16 18:32:25.728042
 ins                 | 2015-06-16 17:46:14.337105
 ins                 | 2015-06-16 17:47:14.430995
(3 rows)
You will need to change the digit at the end of the "$hist_table_access_3" view to match your query history version.

Is there a termination character for bq in interactive mode? How do I set it?

I've just started using BigQuery and I'm used to writing SQL across multiple lines. However, if I run
bq shell
to get into interactive mode, I can't put in a query that runs across multiple lines without bq reporting an error, as it evaluates the first line of the instruction and then complains there are no FROM or GROUP BY clauses.
In other database clients, I can set a termination character: e.g., in DB2,
db2 -t
allows me to run db2 with commands terminated with ;
Is there a way to run bq with a termination character for each statement? I've looked at https://developers.google.com/bigquery/bq-command-line-tool and although it refers to global flags, I don't see a reference to termination characters.
After delving into the source code for bq, I can confirm there is no such termination character that allows you to do multi-line queries.
It's a consequence of the cmd module that bq shell is built upon.
As an alternative, you can run queries directly from your shell with bq query "YOUR QUERY", since the shell allows multi-line commands when they are enclosed in double quotes (").
Example:
bq query "SELECT station_number, year, month, day
FROM [publicdata:samples.gsod]
LIMIT 10"
+----------------+------+-------+-----+
| station_number | year | month | day |
+----------------+------+-------+-----+
|          42420 | 2007 |     5 |  20 |
|          42080 | 2007 |     5 |   5 |
|         152990 | 1990 |     3 |  26 |
|         543110 | 1976 |    10 |  24 |
|         740430 | 1966 |    11 |  30 |
|         228540 | 1949 |     9 |  23 |
|         747809 | 2009 |     7 |  17 |
|         681120 | 1997 |     2 |  15 |
|          26070 | 2008 |    12 |  27 |
|         128430 | 1988 |     9 |  22 |
+----------------+------+-------+-----+

What is the location of built-in SQL functions and Oracle packages in an Oracle database

I want to know the location of the file or table where the definitions of Oracle's built-in functions, packages, and procedures, such as MAX() and DBMS_OUTPUT, are stored.
In the PL/SQL engine, the Oracle supplied functions such as MAX() are part of the package STANDARD in the SYS schema.
Most other supplied packages also reside in the SYS schema; however, you can find out where any individual package is located quite easily - for example:
SELECT *
FROM all_objects
WHERE object_name = 'DBMS_OUTPUT'
Results:
| OWNER  | OBJECT_NAME | SUBOBJECT_NAME | OBJECT_ID | DATA_OBJECT_ID | OBJECT_TYPE | CREATED                       | LAST_DDL_TIME                 | TIMESTAMP           | STATUS | TEMPORARY | GENERATED | SECONDARY | NAMESPACE | EDITION_NAME |
|--------|-------------|----------------|-----------|----------------|-------------|-------------------------------|-------------------------------|---------------------|--------|-----------|-----------|-----------|-----------|--------------|
| SYS    | DBMS_OUTPUT | (null)         | 4972      | (null)         | PACKAGE     | August, 27 2011 08:22:22+0000 | August, 27 2011 08:22:22+0000 | 2011-08-27:08:22:22 | VALID  | N         | N         | N         | 1         | (null)       |
| PUBLIC | DBMS_OUTPUT | (null)         | 4973      | (null)         | SYNONYM     | August, 27 2011 08:22:22+0000 | August, 27 2011 08:22:22+0000 | 2011-08-27:08:22:22 | VALID  | N         | N         | N         | 1         | (null)       |
The following documentation page lists off most (if not all) PL/SQL supplied packages:
http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/intro.htm#BABGEDBH
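If you want to look at the declarations themselves, the source of the supplied packages is exposed through the data dictionary. A sketch, assuming your account is allowed to see SYS objects (STANDARD is the package that declares functions like MAX):
SELECT text
FROM all_source
WHERE owner = 'SYS'
  AND name = 'STANDARD'
  AND type = 'PACKAGE'
ORDER BY line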
The scripts that create the built-in functions, packages, and procedures are stored on the database server machine. Find the value of the environment variable $ORACLE_HOME, then go to $ORACLE_HOME/rdbms/admin/. Just use grep to find the file you're looking for.
If the database server is a Windows machine, run ECHO %ORACLE_HOME% at the command prompt and proceed from there.
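For example, on a Unix server (the search string dbms_output here is just an illustration):
$ cd $ORACLE_HOME/rdbms/admin
$ grep -li dbms_output *.sql
grep -li prints, case-insensitively, the names of the files that mention the package.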