input {
  twitter {
    consumer_key => "--"
    consumer_secret => "-"
    oauth_token => "--"
    oauth_token_secret => "--"
    keywords => ["innovation"]
    full_tweet => true
  }
}
filter {
}
output {
  stdout {
    codec => dots
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "innotweets"
  }
}
This is the config file I use to pull tweets from Twitter and index them in Elasticsearch. It works well when I am not using a VPN, but when I execute it behind a proxy server I am not able to create the index in Elasticsearch. What must I do to get past the proxy?
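One possible direction, as a sketch only: the most likely failure behind a proxy is the twitter input not reaching the Twitter API, since Elasticsearch here is on localhost. The twitter input plugin and the elasticsearch output both have proxy-related settings in recent versions, but verify the option names against the docs for your installed plugin versions; the proxy host and port below are placeholders.

input {
  twitter {
    consumer_key => "--"
    consumer_secret => "-"
    oauth_token => "--"
    oauth_token_secret => "--"
    keywords => ["innovation"]
    full_tweet => true
    # assumed options of logstash-input-twitter; check your plugin version's docs
    use_proxy => true
    proxy_address => "proxy.example.com"   # placeholder proxy host
    proxy_port => 3128                     # placeholder proxy port
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "innotweets"
    # only needed if Elasticsearch itself sits beyond the proxy;
    # the output's forward HTTP proxy setting would look like:
    # proxy => "http://proxy.example.com:3128"
  }
}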
Using the authentication plugin, I'm unable to verify my credentials via JSON. I can use Form, Jwt, and Basic (for testing it works). The error returned is 'FAILURE_CREDENTIALS_INVALID'. Here's the sample code, "simplified for brevity":
public function token()
{
    $result = $this->Authentication->getResult();
    if ($result->isValid()) {
        // new jwt token etc
        $message = true;
        $this->log($result->getStatus());
    } else {
        $message = false;
        $this->log($result->getStatus());
    }
    $this->set([
        'message' => $message,
    ]);
    $this->viewBuilder()->setOption('serialize', ['message']);
}
My application class has the method
public function getAuthenticationService(ServerRequestInterface $request): AuthenticationServiceInterface
{
    $service = new AuthenticationService();
    $service->setConfig([
        'unauthenticatedRedirect' => '/users/login',
        'queryParam' => 'redirect',
    ]);
    $fields = [
        'username' => 'email',
        'password' => 'password'
    ];
    // Load the authenticators, you want session first
    $service->loadAuthenticator('Authentication.Session');
    $service->loadAuthenticator('Authentication.Form', [
        'fields' => $fields,
        'loginUrl' => '/users/login'
    ]);
    // Testing jwt
    $service->loadAuthenticator('Authentication.Jwt', [
        'secretKey' => Security::getSalt()
    ]);
    // Load identifiers
    $service->loadIdentifier('Authentication.Password', compact('fields'));
    $service->loadIdentifier('Authentication.JwtSubject');
    return $service;
}
The app is serving some JSON just fine if I pass in the JWT, but somehow I can't figure out how to request a new one when the old one expires.
Here's my middleware queue:
// some code
$middlewareQueue
    ->add(new ErrorHandlerMiddleware(Configure::read('Error')))
    ->add(new AssetMiddleware([
        'cacheTime' => Configure::read('Asset.cacheTime'),
    ]))
    ->add(new RoutingMiddleware($this))
    ->add($csrf)
    ->add(new BodyParserMiddleware())
    ->add(new AuthenticationMiddleware($this));
// more code
I DO NOT use basic
As mentioned in the comments, if you want to authenticate using the form authenticator on your token retrieval endpoint, then you need to make sure that you include the URL path of that endpoint in the authenticator's loginUrl option.
That option accepts an array, so it should be as simple as this:
$service->loadAuthenticator('Authentication.Form', [
    'fields' => $fields,
    'loginUrl' => [
        '/users/login',
        '/api/users/token.json',
    ],
]);
The error that you were receiving occurred because the form authenticator simply wasn't applied on the token endpoint. The authentication service therefore moved on to the next authenticator, the JWT authenticator, which isn't bound to a specific endpoint and can run for all of them. The JWT authenticator of course expects a different payload: it looks for a token, and if it can't find one it returns the error status FAILURE_CREDENTIALS_INVALID.
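Once the form authenticator is applied on /api/users/token.json, the token() action can mint a fresh JWT whenever valid credentials are posted, which also covers requesting a new one after the old one expires. The following is only a minimal sketch: it assumes the firebase/php-jwt library (which the plugin's Jwt authenticator builds on), HS256 signing with the app salt to match the secretKey configured above, and an identity that exposes an id field; the claims and one-hour lifetime are illustrative, not prescribed by the plugin.

// at the top of the controller file:
use Cake\Utility\Security;
use Firebase\JWT\JWT;

public function token()
{
    $result = $this->Authentication->getResult();
    if ($result->isValid()) {
        // identity data from the form/password identifiers (assumed to expose `id`)
        $user = $result->getData();
        $payload = [
            'sub' => $user->id,       // subject claim, read back by the JwtSubject identifier
            'exp' => time() + 3600,   // illustrative one-hour lifetime
        ];
        $message = JWT::encode($payload, Security::getSalt(), 'HS256');
    } else {
        $this->response = $this->response->withStatus(401);
        $message = false;
    }
    $this->set(['message' => $message]);
    $this->viewBuilder()->setOption('serialize', ['message']);
}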
This is the relevant part of my logstash.conf:
output {
  stomp {
    host => "localhost"
    port => "61613"
    destination => "/queue/test"
    user => "admin"
    password => "admin"
    headers => {
      "persistent" => true
    }
  }
  stdout {}
}
Now I want to send messages to ActiveMQ over SSL. What should I do?
Based on this PR from the logstash-plugins project, it appears that SSL/TLS is not supported with Stomp.
I am building an intranet site that uses an ASP.NET Core Web API backend and an Angular front end. Because of the way the DB and overall project structure were written, I have an unconventional way of authorizing users: I grab the Windows login name (not using Identity or any login page) and compare it to a list of authorized users in my DB. I got the authorization handler working, however I am stuck on finding a way to prevent my policy from redirecting to a login page (none will exist). Instead of redirecting, I just want the 401 status code, so I can use Angular to show a notification.
I've done various searches on Google/Stack Overflow; all the examples and solutions use either Identity or token policies, and I am not going that route. I am only using a fake cookie auth just to get my authorization policy to work.
public void ConfigureServices(IServiceCollection services)
{
    services.AddAutoMapper();
    services.AddScoped<IChecklistRepository, ChecklistRepository>();
    services.AddCors(o => o.AddPolicy("Angular", b =>
    {
        b.AllowAnyHeader().AllowAnyMethod().AllowAnyOrigin();
    }));
    services.AddAuthentication(CookieAuthenticationDefaults.AuthenticationScheme)
        .AddCookie(CookieAuthenticationDefaults.AuthenticationScheme,
            opt =>
            {
                opt.LoginPath = null;
                opt.AccessDeniedPath = null;
                // Does not do anything
            });
    services.AddDbContext<SWAT_UpdateChecklistsContext>(opt => opt.UseMySql(Configuration.GetConnectionString("conn")));
    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1).AddJsonOptions(o =>
    {
        o.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
    });
    services.AddSpaStaticFiles(configuration =>
    {
        configuration.RootPath = "ClientApp/dist";
    });
    services.AddAuthorization(opt =>
    {
        opt.AddPolicy("AccessUser", policy =>
        {
            policy.Requirements.Add(new UserAccess());
        });
    });
    services.AddTransient<IAuthorizationHandler, AuthorizedUser>();
}
I did a little more digging and think I found my answer:
services.AddAuthentication(CookieAuthenticationDefaults.AuthenticationScheme)
    .AddCookie(CookieAuthenticationDefaults.AuthenticationScheme,
        opt =>
        {
            opt.Events.OnRedirectToLogin = ctx =>
            {
                ctx.Response.StatusCode = StatusCodes.Status401Unauthorized;
                return Task.CompletedTask;
            };
            opt.Events.OnRedirectToAccessDenied = ctx =>
            {
                ctx.Response.StatusCode = StatusCodes.Status403Forbidden;
                return Task.CompletedTask;
            };
        });
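With those event overrides in place, any endpoint guarded by the AccessUser policy surfaces a plain 401/403 that the Angular app can intercept, instead of a redirect. A hypothetical controller sketch (the controller name, route, and return payload are made up for illustration):

[Route("api/[controller]")]
[ApiController]
public class ChecklistsController : ControllerBase
{
    // Unauthenticated callers now receive 401 and unauthorized ones 403,
    // because the cookie events above replace the redirect behaviour.
    [HttpGet]
    [Authorize(Policy = "AccessUser")]
    public IActionResult Get()
    {
        return Ok(new[] { "item1", "item2" });
    }
}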
When I want to retrieve ParseObjects (Posts) from the Parse DB (Mongo) to display in my Android app, I need to add new fields to each ParseObject in the cloud code before delivering it to the client. These fields are only needed on the client, so I do not want to save them to the cloud/DB. But for some weird reason it seems like the additional fields are only delivered to the client successfully if I save them to the cloud.
Something like this will work:
Parse.Cloud.define("getPosts", function(request, response) {
    const query = new Parse.Query("Post");
    query.find()
        .then((results) => {
            results.forEach(result => {
                result.set("cloudTestField", "this is a testing server cloud field");
            });
            return Parse.Object.saveAll(results);
        })
        .then((results) => {
            response.success(results);
        })
        .catch(() => {
            response.error("wasnt able to retrieve post parse objs");
        });
});
This delivers all the new fields to my client.
But if I don't save them to the DB and just add them prior to delivery, something like this:
Parse.Cloud.define("getPosts", function(request, response) {
    const query = new Parse.Query("Post");
    query.find()
        .then((results) => {
            results.forEach(result => {
                result.set("cloudTestField", "this is a testing server cloud field");
            });
            response.success(results);
        })
        .catch(() => {
            response.error("wasnt able to retrieve post parse objs");
        });
});
then for some reason, in my Android client log, I receive null for the key "cloudTestField":
ParseCloud.callFunctionInBackground("getPosts", params,
        new FunctionCallback<List<ParseObject>>() {
            @Override
            public void done(List<ParseObject> objects, ParseException e) {
                if (objects.size() > 0 && e == null) {
                    for (ParseObject postObj : objects) {
                        Log.d("newField", postObj.getString("cloudTestField"));
                    }
                } else if (objects.size() <= 0) {
                    Log.d("parseCloudResponse", "sorry man. no objects from server");
                } else {
                    Log.d("parseCloudResponse", e.getMessage());
                }
            }
        });
And for some reason, the output is:
newField: null
How do I add fields to ParseObjects in the cloud without saving them to the DB?
Turns out you cannot add non-persistent fields to a ParseObject.
So I needed to convert the ParseObjects to JSON, and now it's working like a charm:
Parse.Cloud.define("getPosts", function(request, response) {
    const query = new Parse.Query("Post");
    var postJsonList = [];
    query.find()
        .then((results) => {
            results.forEach(result => {
                var post = result.toJSON();
                post.cloudTestField = "this is a testing server cloud field";
                postJsonList.push(post);
            });
            response.success(postJsonList);
        })
        .catch(() => {
            response.error("wasnt able to retrieve post parse objs");
        });
});
I've got the following logstash config, and I'm trying to send the RabbitMQ headers (which are stored in the @metadata field) to Elasticsearch:
input {
  rabbitmq {
    auto_delete => false
    durable => false
    host => "my_host"
    port => 5672
    queue => "my_queue"
    key => "#"
    threads => 1
    codec => "plain"
    user => "user"
    password => "pass"
    metadata_enabled => true
  }
}
filter {
  ???
}
output {
  stdout { codec => rubydebug { metadata => true } }
  elasticsearch { hosts => "localhost" }
}
I can see the headers in the std output
{
    "@timestamp" => 2017-07-11T15:53:28.629Z,
    "@metadata" => {
        "rabbitmq_headers" => {
            "My_Header" => "My_value"
        },
        "rabbitmq_properties" => {
            "content-encoding" => "utf-8",
            "correlation-id" => "785901df-e954-4735-a9cf-868088fdac87",
            "content-type" => "application/json",
            "exchange" => "My_Exchange",
            "routing-key" => "123-456",
            "consumer-tag" => "amq.ctag-ZtX3L_9Zsz96aakkSGYzGA"
        }
    },
    "@version" => "1",
    "message" => "{...}"
}
Is there some filter (grok, mutate, kv, etc.) which can copy these values to tags in the message sent to Elasticsearch?
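One approach, as a sketch to adapt to your versions: @metadata is deliberately dropped before events reach outputs, so copy the values you need onto the event itself with a mutate filter; sprintf references into @metadata work in add_field, and the resulting fields (rather than tags, which are just a flat array) are what you would query in Elasticsearch. The field names on the left are arbitrary placeholders.

filter {
  mutate {
    # copy selected @metadata values onto the event so they get indexed
    add_field => {
      "my_header"      => "%{[@metadata][rabbitmq_headers][My_Header]}"
      "correlation_id" => "%{[@metadata][rabbitmq_properties][correlation-id]}"
      "routing_key"    => "%{[@metadata][rabbitmq_properties][routing-key]}"
    }
  }
}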