I am trying to use cryptsy.com's API to get the current price of DOGE. This is my code:
package main;

import java.text.DecimalFormat;
import java.util.Date;
import java.util.concurrent.TimeUnit;

import main.Cryptsy.CryptsyException;
import main.Cryptsy.PublicMarket;

public class Main {
    public static void main(String[] args) throws CryptsyException, InterruptedException {
        Cryptsy cryptsy = new Cryptsy();
        while (true) {
            PublicMarket[] markets = cryptsy.getPublicMarketData();
            for (PublicMarket market : markets) {
                DecimalFormat df = new DecimalFormat("#.########");
                if (market.label.equals("DOGE/BTC"))
                    System.out.println(new Date() + " " + market.label + " " + df.format(market.lasttradeprice));
            }
            TimeUnit.SECONDS.sleep(30);
        }
    }
}
The problem is that the price gets updated too rarely (every 30 minutes or so), and only if I restart my program. Does anyone know how to get the current price? There are also connection errors sometimes.
Actually, the connection problems are normal with the Cryptsy API. It's slow and often disconnects without an answer; their servers are overloaded almost all the time.
There is a new API location that should be faster and solve the connection issues:
http://pubapi.cryptsy.com/api.php?method=marketdatav2
Also, if you are only interested in one single currency, you can get the market data for just that currency. The whole answer from Cryptsy for all currencies is around 300 KB, so you would waste bandwidth if you polled it every minute or so.
For a single currency it looks like:
http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid={MARKET ID}
The market ID can be found inside the answer of the first URL. You only need the integer ID of the market once; from then on you can always use the direct call.
Every detail is, by the way, available here:
https://www.cryptsy.com/pages/api
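Polling that single-market URL from Java is straightforward. Here is a minimal sketch (the market id 132 for DOGE/BTC is an assumption, so look the real one up once via the marketdatav2 answer, and prefer a proper JSON library such as Gson or Jackson over the crude string search shown here):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class DogePricePoller {
    // Assumption: 132 is the DOGE/BTC market id; fetch it once from marketdatav2.
    private static final String URL_STRING =
        "http://pubapi.cryptsy.com/api.php?method=singlemarketdata&marketid=132";

    public static void main(String[] args) throws Exception {
        while (true) {
            HttpURLConnection conn = (HttpURLConnection) new URL(URL_STRING).openConnection();
            conn.setConnectTimeout(10_000); // the API is slow; don't hang forever
            conn.setReadTimeout(10_000);
            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line);
                }
            }
            // Crude extraction of the "lasttradeprice" field from the JSON answer.
            int i = body.indexOf("\"lasttradeprice\":\"");
            if (i >= 0) {
                int start = i + "\"lasttradeprice\":\"".length();
                System.out.println("DOGE/BTC last trade: "
                    + body.substring(start, body.indexOf("\"", start)));
            }
            Thread.sleep(30_000);
        }
    }
}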
I'm writing client-server applications on top of Netty.
I'm starting with a simple client login server that validates the info sent from the client against the database. This all works fine.
On the client side, I want to use if statements once the response is received from the server, depending on whether the login credentials validated or not; that also works fine. My problem is that the channelRead method does not return anything, and I cannot change that. I need it to return a boolean which lets the login attempt succeed or fail.
Once channelRead() returns, I lose the content of the data.
I tried adding the msg to a List but, for some reason, the message data is not stored in the List.
Any suggestions are welcome. I'm new to this, and it's the only way I've figured out so far. I have also tried using boolean variables inside channelRead(), but since the method is void, the boolean values are lost once it returns.
The following is my last attempt, trying to insert the message data into the list I created:
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

import java.util.ArrayList;
import java.util.List;

public class LoginClientHandler extends ChannelInboundHandlerAdapter {

    Player player = new Player();
    String response;
    public volatile boolean loginSuccess;
    // Object message = new Object();
    private Object msg;

    // List is an interface and cannot be instantiated directly; use an ArrayList.
    public static final List<Object> incomingMessage = new ArrayList<>();

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        // incomingMessage.clear();
        response = (String) msg;
        System.out.println("channel read response = " + response);
        incomingMessage.add(0, msg);
        System.out.println("incoming message = " + incomingMessage.get(0));
    }
}
How can I get the message data out of the channelRead() method, or use this method to trigger a change in my business logic? I want it to either display a message telling the client the login failed and to try again, or to succeed and load the next scene. The business logic itself works fine, but I can't wire it up to Netty because none of the handler methods return anything I can use.
ChannelInitializer
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.DelimiterBasedFrameDecoder;
import io.netty.handler.codec.Delimiters;
import io.netty.handler.codec.string.StringDecoder;
import io.netty.handler.codec.string.StringEncoder;

public class LoginClientInitializer extends ChannelInitializer<SocketChannel> {

    @Override
    protected void initChannel(SocketChannel ch) throws Exception {
        ChannelPipeline pipeline = ch.pipeline();
        pipeline.addLast("framer", new DelimiterBasedFrameDecoder(8192, Delimiters.lineDelimiter()));
        pipeline.addLast("decoder", new StringDecoder());
        pipeline.addLast("encoder", new StringEncoder());
        pipeline.addLast("handler", new LoginClientHandler());
    }
}
To get the server to write data to the client, call ctx.write(). Here is a basic echo server and client example from the Netty in Action book: https://github.com/normanmaurer/netty-in-action/blob/2.0-SNAPSHOT/chapter2/Server/src/main/java/nia/chapter2/echoserver/EchoServerHandler.java
There are several other good examples in that repo.
I highly recommend reading the Netty in Action book if you're starting out with Netty. It will give you a solid foundational understanding of the framework and how it's intended to be used.
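Since channelRead() is void, the usual way to get the result out is to hand it off through a callback or a future instead of a return value. Here is a minimal sketch of that idea (this is not from the book; the "LOGIN_OK" response token and the class name are assumptions, so adapt them to your protocol). The handler completes a CompletableFuture<Boolean> when the server's answer arrives, and the business logic waits on that future:

import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

import java.util.concurrent.CompletableFuture;

public class LoginResultHandler extends ChannelInboundHandlerAdapter {

    // Completed exactly once, when the server's login response arrives.
    private final CompletableFuture<Boolean> loginResult = new CompletableFuture<>();

    public CompletableFuture<Boolean> loginResult() {
        return loginResult;
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        // The StringDecoder earlier in the pipeline already produced a String.
        String response = (String) msg;
        // "LOGIN_OK" is an assumed protocol token; compare against whatever your server sends.
        loginResult.complete("LOGIN_OK".equals(response));
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
        loginResult.completeExceptionally(cause);
        ctx.close();
    }
}

The calling code can then block with loginResult().get(10, TimeUnit.SECONDS), or attach a non-blocking callback via thenAccept(success -> ...) to show the failure message or load the next scene, without ever needing channelRead() to return anything.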
pipe.hset(uuid, "name", "Archie");
This is an example of how I am using the hset. There are about 10 other attributes (name, age, etc.).
I am trying to remove the entire hset, e.g. remove uuid so it is no longer a key (is key the right term?).
I have tried removing each element individually through a pipeline:
for (String s : profileData) {
pipe.hdel("profile#" + uuid.toString(), s);
}
But firstly, this has time complexity O(n) and so could be more efficient, and secondly it isn't actually working for me, as the keys are still present (I think this could be a fault in my own code).
I've seen questions asking for a hdelall function and I know that one doesn't exist.
I also tried using:
pipe.del(uuid);
But this does nothing, so obviously I'm using it incorrectly. I assumed it would just delete the whole hash, but it doesn't; is it meant to delete a single value instead? I'm unsure.
So my question boils down to: how can I efficiently remove an entire hash from Redis using Jedis?
Thank you.
I'm not sure what your code looks like, but DEL does remove an entire hash. I did this quick test and it worked as expected. One thing to check: the key passed to del must match exactly the key the hash was created under, so if you wrote the fields under "profile#" + uuid, you must delete "profile#" + uuid, not just uuid.
import redis.clients.jedis.JedisPool;
import redis.clients.jedis.JedisPoolConfig;
import redis.clients.jedis.Pipeline;

import java.time.Duration;
import java.util.Set;

public class TestRedisDelete {

    public static void main(String[] args) {
        TestRedisDelete redis = new TestRedisDelete();
        Pipeline p = redis.jedisPool.getResource().pipelined();
        p.hset("h1", "f", "v");
        p.hset("h2", "f", "v");
        p.hset("h3", "f", "v");
        p.del("h1");
        p.sync();
        Set<String> keys = redis.jedisPool.getResource().keys("*");
        System.out.println(keys);
    }

    final JedisPoolConfig poolConfig = buildPoolConfig();
    JedisPool jedisPool = new JedisPool(poolConfig, "127.0.0.1", 6379);

    private JedisPoolConfig buildPoolConfig() {
        final JedisPoolConfig poolConfig = new JedisPoolConfig();
        poolConfig.setMaxTotal(10);
        poolConfig.setMaxIdle(10);
        poolConfig.setMinIdle(4);
        poolConfig.setTestOnBorrow(true);
        poolConfig.setTestOnReturn(true);
        poolConfig.setTestWhileIdle(true);
        poolConfig.setMinEvictableIdleTimeMillis(Duration.ofSeconds(60).toMillis());
        poolConfig.setTimeBetweenEvictionRunsMillis(Duration.ofSeconds(30).toMillis());
        poolConfig.setNumTestsPerEvictionRun(3);
        poolConfig.setBlockWhenExhausted(true);
        return poolConfig;
    }
}
Output: [h2, h3]
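Applied to the keys from the question, the same approach would be (a sketch, assuming the hash was created under "profile#" + uuid):

pipe.del("profile#" + uuid.toString());
pipe.sync();

A single DEL removes the key and its entire hash in one command, so the per-field hdel loop is unnecessary.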
How about using the del method of Jedis:
jedis.del(uuid);
Check this link for more details
I am trying to get the NAS storage list.
I tested two ways: one using the brand (BAP) ID, the other using the account ID directly.
First:
Using the BAP ID, get the account list.
Using the account ID, get the NAS storage list.
==> I did not get the NAS storage list.
Second:
Using the account ID directly, get the NAS storage list.
===> Successfully got the NAS storage list.
I don't understand the difference between the two ways.
I attached the first test code below.
The getNasNetworkStorageCount method returns the NAS storage count, but getNasNetworkStorage returns null.
public void test() {
    String userId = "IBMxxxxx";
    String apiKey = "xxxxx";

    client = new RestApiClient().withCredentials(userId, apiKey).withLoggingEnabled();
    Account.Service accountService = Account.service(client);
    List<Brand> brandList = accountService.getOwnedBrands();

    for (Brand brand : brandList) {
        Brand.Service brandService = brand.asService(client);

        Account.Mask mask = new Account.Mask();
        mask.id();
        mask.companyName();
        mask.accountStatusId();
        mask.email();
        mask.hardwareCount();
        mask.hardware();
        mask.virtualGuestCount();
        mask.virtualGuests();
        mask.nasNetworkStorage();
        mask.nasNetworkStorageCount();
        brandService.clearMask();
        brandService.setMask(mask);

        List<Account> accountList = brandService.getOwnedAccounts();
        for (Account account : accountList) {
            if (account.getNasNetworkStorageCount() != 0) {
                System.out.print(account.getNasNetworkStorageCount() + " == ");
                System.out.println(account.getNasNetworkStorage().size());
            }
        }
        System.out.println(accountList.size());
    }
}
Your results might be like that because when you run the SoftLayer_Brand::getOwnedAccounts method, it only returns the account for the current user (i.e. the user that's calling the API).
You can run this Java example and see that the brand effectively retrieves the right account for the calling user, and then all the NAS network storages that belong to it.
package SoftLayer_Java_Scripts.Examples;

import com.google.gson.Gson;
import com.softlayer.api.*;
import com.softlayer.api.service.Account;
import com.softlayer.api.service.Brand;
import com.softlayer.api.service.network.Storage;

import java.util.List;

public class GetNasNetworkStorage
{
    public static void main(String[] args)
    {
        String user = "set me";
        String apiKey = "set me";
        long brandId = 2L;

        ApiClient client = new RestApiClient().withCredentials(user, apiKey);
        Brand.Service brandService = Brand.service(client, brandId);

        try
        {
            List<Account> accountsList = brandService.getOwnedAccounts();
            Gson gson = new Gson();
            for (Account account : accountsList) {
                Account.Service accountService = account.asService(client);
                List<Storage> nasStorageList = accountService.getNasNetworkStorage();
                for (Storage storage : nasStorageList) {
                    System.out.println(gson.toJson(storage));
                }
            }
        }
        catch (Exception e)
        {
            System.out.println("Script failed, review the next message for further details: " + e.getMessage());
        }
    }
}
The difference is that the Brand service is for managing brand accounts, whilst using the account service directly manages all the information about a particular account.
Currently there may be an issue with the object mask that you are using. However, the problem with using the Brand service is that it was designed only to display basic information about the accounts which belong to the brand; it was not designed to display all the information of the related accounts (even if you use object masks). I am going to report the object-mask issue to SoftLayer, I mean the one where nasNetworkStorage returns null, but I have already reported similar issues and they were not fixed, because, as I said, that is not the purpose of the service.
You can also try setting the object mask as a string; maybe that works, e.g.:
brandService.setMask("mask[id,companyName,accountStatusId,email,hardwareCount,hardware,virtualGuestCount,VirtualGuest,nasNetworkStorage,nasNetworkStorageCount]");
Anyway, the most reliable way to get that information for the accounts associated with the brand is to use the master user of each account, I mean using the account service; even the SoftLayer agent portal uses the master account to get more information about a particular account in your brand.
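For reference, fetching the NAS storage straight from the account service (the working "second way" from the question) looks roughly like this; a minimal sketch, assuming the credentials belong to the account's master user:

import com.softlayer.api.ApiClient;
import com.softlayer.api.RestApiClient;
import com.softlayer.api.service.Account;
import com.softlayer.api.service.network.Storage;

import java.util.List;

public class GetNasStorageDirect
{
    public static void main(String[] args)
    {
        // Assumption: master-user credentials for the account in question.
        ApiClient client = new RestApiClient().withCredentials("set me", "set me");

        // Account.service(client) targets the account that owns these credentials.
        Account.Service accountService = Account.service(client);
        List<Storage> nasStorageList = accountService.getNasNetworkStorage();
        for (Storage storage : nasStorageList) {
            System.out.println(storage.getId() + " - " + storage.getUsername());
        }
    }
}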
Let me know if you have more questions
Regards
On the eBay website you can search for, e.g., bracelets whose main color is silver (see the screenshot below).
Is it possible to run such a query (find the newest bracelets with main color silver) programmatically (via the eBay Search API)? If yes, how?
I looked at findItemsAdvanced, but didn't find any reference to color search there.
Refining searches with the Finding API is done through aspect filters, which are defined in the documentation as:
Aspects are well-known, standardized characteristics of an item. For example, "Screen Size," "Processor Type," and "Processor Speed" could be aspects of Laptop PCs. Aspects can vary for different kinds of items. For example, the aspects of Laptop PCs are different from those of Women's Dresses (aspects for Women's Dresses might include "Sleeve Style," "Dress Length," and "Size").
Performing a search is a two-step process:
1. Determine what aspects are available for the category you are searching in.
2. Choose which aspects you will use and integrate them as filters in your search request.
The following examples use the Finding Kit that eBay provides as part of their Java SDK.
The first example shows how to obtain the aspects that are available. This is achieved via the getHistograms operation. For this example we will use the category Jewelry & Watches > Fashion Jewelry > Rings (67681) found on the US eBay site.
import java.util.List;

import com.ebay.services.client.ClientConfig;
import com.ebay.services.client.FindingServiceClientFactory;
import com.ebay.services.finding.*;

public class GetHistograms {

    public static void main(String[] args) {
        try {
            ClientConfig config = new ClientConfig();
            config.setApplicationId("<YOUR EBAY APP ID>");
            FindingServicePortType serviceClient = FindingServiceClientFactory.getServiceClient(config);

            GetHistogramsRequest request = new GetHistogramsRequest();
            request.setCategoryId("67681");

            GetHistogramsResponse result = serviceClient.getHistograms(request);
            AspectHistogramContainer aspectHistogramContainer = result.getAspectHistogramContainer();
            List<Aspect> aspects = aspectHistogramContainer.getAspect();
            for (Aspect aspect : aspects) {
                System.out.println("* " + aspect.getName() + " *");
                List<AspectValueHistogram> values = aspect.getValueHistogram();
                for (AspectValueHistogram value : values) {
                    System.out.println(value.getValueName());
                }
            }
        } catch (Exception ex) {
            ex.printStackTrace(); // don't swallow errors silently
        }
    }
}
Each category can have multiple aspects that consist of a name, such as Color, and several values such as Red, White, Blue. An example of the output produced by this code is shown below.
Metal Purity
10k
14k
18k
Main Stone
No Stone
Abalone
Agate
Main Color
Aqua
Black
Blue
The names and values returned from getHistograms can now be used as filters in the findItemsAdvanced operation. For the second example we will use the Brand and Main Color aspects. This example also uses the same category as the first.
import java.util.List;

import com.ebay.services.client.ClientConfig;
import com.ebay.services.client.FindingServiceClientFactory;
import com.ebay.services.finding.*;

public class FindItem {

    public static void main(String[] args) {
        try {
            ClientConfig config = new ClientConfig();
            config.setApplicationId("<YOUR EBAY APP ID>");
            FindingServicePortType serviceClient = FindingServiceClientFactory.getServiceClient(config);

            FindItemsAdvancedRequest request = new FindItemsAdvancedRequest();
            request.getCategoryId().add("67681");

            AspectFilter aspectFilter = new AspectFilter();
            aspectFilter.setAspectName("Brand");
            aspectFilter.getAspectValueName().add("Paula Abdul");
            aspectFilter.getAspectValueName().add("Tiffany");
            aspectFilter.getAspectValueName().add("Tommy Hilfiger");
            request.getAspectFilter().add(aspectFilter);

            aspectFilter = new AspectFilter();
            aspectFilter.setAspectName("Main Color");
            aspectFilter.getAspectValueName().add("Gold");
            aspectFilter.getAspectValueName().add("Silver");
            request.getAspectFilter().add(aspectFilter);

            PaginationInput pi = new PaginationInput();
            pi.setEntriesPerPage(2);
            request.setPaginationInput(pi);

            FindItemsAdvancedResponse result = serviceClient.findItemsAdvanced(request);
            System.out.println("Found " + result.getSearchResult().getCount() + " items.");
            List<SearchItem> items = result.getSearchResult().getItem();
            for (SearchItem item : items) {
                System.out.println(item.getTitle());
            }
        } catch (Exception ex) {
            ex.printStackTrace(); // don't swallow errors silently
        }
    }
}
You probably want to use AspectFilters. The input for that can be found in the previous query (as stated in the documentation; see AspectHistogramContainer).
VariationType appears to be able to do that.
I am working on a Java application which uses BigQuery as the analytics engine. I was able to run query jobs (and get results) using the code in "Insert a Query Job", after modifying the code to use a service account per this comment on Stack Overflow.
Now I need to run an extract job to export a table to a bucket on Google Storage. Based on "Exporting a Table", I was able to modify the Java code to insert extract jobs (code below). When run, the extract job's status changes from PENDING to RUNNING to DONE. The problem is that no file is actually uploaded to the specified bucket.
Info that might be helpful:
The createAuthorizedClient function returns a Bigquery instance and works for query jobs, so there are probably no issues with the service account, private key, etc.
I also tried creating and running the insert job manually in Google's API Explorer, and the file is successfully created in the bucket. I am using the same values for project, dataset, table, and destination URI as in the code, so these should be correct.
Here is the code (pasting the complete file in case somebody else finds this useful):
import java.io.File;
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.Arrays;
import java.util.List;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson.JacksonFactory;
import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.Bigquery.Jobs.Insert;
import com.google.api.services.bigquery.BigqueryScopes;
import com.google.api.services.bigquery.model.Job;
import com.google.api.services.bigquery.model.JobConfiguration;
import com.google.api.services.bigquery.model.JobConfigurationExtract;
import com.google.api.services.bigquery.model.JobReference;
import com.google.api.services.bigquery.model.TableReference;

public class BigQueryJavaGettingStarted {

    private static final String PROJECT_ID = "123456789012";
    private static final String DATASET_ID = "MY_DATASET_NAME";
    private static final String TABLE_TO_EXPORT = "MY_TABLE_NAME";
    private static final String SERVICE_ACCOUNT_ID = "123456789012-...@developer.gserviceaccount.com";
    private static final File PRIVATE_KEY_FILE = new File("/path/to/privatekey.p12");
    private static final String DESTINATION_URI = "gs://mybucket/file.csv";
    private static final List<String> SCOPES = Arrays.asList(BigqueryScopes.BIGQUERY);
    private static final HttpTransport TRANSPORT = new NetHttpTransport();
    private static final JsonFactory JSON_FACTORY = new JacksonFactory();

    public static void main(String[] args) {
        try {
            executeExtractJob();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static final void executeExtractJob() throws IOException, InterruptedException, GeneralSecurityException {
        Bigquery bigquery = createAuthorizedClient();

        // Create a new extract job
        Job job = new Job();
        JobConfiguration config = new JobConfiguration();
        JobConfigurationExtract extractConfig = new JobConfigurationExtract();
        TableReference sourceTable = new TableReference();
        sourceTable.setProjectId(PROJECT_ID).setDatasetId(DATASET_ID).setTableId(TABLE_TO_EXPORT);
        extractConfig.setSourceTable(sourceTable);
        extractConfig.setDestinationUri(DESTINATION_URI);
        config.setExtract(extractConfig);
        job.setConfiguration(config);

        // Insert/execute the created extract job
        Insert insert = bigquery.jobs().insert(PROJECT_ID, job);
        insert.setProjectId(PROJECT_ID);
        JobReference jobId = insert.execute().getJobReference();

        // Now check to see if the job has successfully completed (optional for extract jobs?)
        long startTime = System.currentTimeMillis();
        long elapsedTime;
        while (true) {
            Job pollJob = bigquery.jobs().get(PROJECT_ID, jobId.getJobId()).execute();
            elapsedTime = System.currentTimeMillis() - startTime;
            System.out.format("Job status (%dms) %s: %s\n", elapsedTime, jobId.getJobId(), pollJob.getStatus().getState());
            if (pollJob.getStatus().getState().equals("DONE")) {
                break;
            }
            // Wait a second before rechecking job status
            Thread.sleep(1000);
        }
    }

    private static Bigquery createAuthorizedClient() throws GeneralSecurityException, IOException {
        GoogleCredential credential = new GoogleCredential.Builder()
            .setTransport(TRANSPORT)
            .setJsonFactory(JSON_FACTORY)
            .setServiceAccountScopes(SCOPES)
            .setServiceAccountId(SERVICE_ACCOUNT_ID)
            .setServiceAccountPrivateKeyFromP12File(PRIVATE_KEY_FILE)
            .build();

        return Bigquery.builder(TRANSPORT, JSON_FACTORY)
            .setApplicationName("My Reports")
            .setHttpRequestInitializer(credential)
            .build();
    }
}
Here is the output:
Job status (337ms) job_dc08f7327e3d48cc9b5ba708efe5b6b5: PENDING
...
Job status (9186ms) job_dc08f7327e3d48cc9b5ba708efe5b6b5: PENDING
Job status (10798ms) job_dc08f7327e3d48cc9b5ba708efe5b6b5: RUNNING
...
Job status (53952ms) job_dc08f7327e3d48cc9b5ba708efe5b6b5: RUNNING
Job status (55531ms) job_dc08f7327e3d48cc9b5ba708efe5b6b5: DONE
It is a small table (about 4 MB), so the job taking about a minute seems OK. I have no idea why no file is created in the bucket, or how to go about debugging this. Any help would be appreciated.
As Craig pointed out, I printed the status.errorResult() and status.errors() values:
getErrorResults(): {"message":"Backend error. Job aborted.","reason":"internalError"}
getErrors(): null
It looks like there was an access denied error writing to the path: gs://pixalate_test/from_java.csv. Can you make sure that the user that was performing the export job has write access to the bucket (and that the file doesn't already exist)?
I've filed an internal bigquery bug on this issue ... we should give a better error in this situation.
I believe the problem is with the bucket name you're using -- mybucket above is just an example, you need to replace that with a bucket you actually own in Google Storage. If you've never used GS before, the intro docs will help.
Your second question was how to debug this -- I'd recommend looking at the returned Job object once the status is set to DONE. Jobs that end in an error still make it to DONE state, the difference is that they have an error result attached, so job.getStatus().hasErrorResult() should be true. (I've never used the Java client libraries, so I'm guessing at that method name.) You can find more information in the jobs docs.
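In the polling loop from the question's code, that check would look something like this sketch (getErrorResult() and getErrors() are the method names in the generated Java model classes; getErrorResult() returns null when the job succeeded):

if (pollJob.getStatus().getState().equals("DONE")) {
    if (pollJob.getStatus().getErrorResult() != null) {
        // The job reached DONE, but with an error attached.
        System.out.println("Error result: " + pollJob.getStatus().getErrorResult().getMessage());
        System.out.println("All errors: " + pollJob.getStatus().getErrors());
    }
    break;
}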
One more difference I notice is that you are not passing the job type with config.setJobType(JOB_TYPE);
where the constant is private static final String JOB_TYPE = "extract";
Also, for JSON output you need to set the destination format as well.
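In the question's executeExtractJob() that would amount to something like the following sketch (the "extract" job type comes from this answer; "NEWLINE_DELIMITED_JSON" is the destination format value the extract configuration accepts for JSON output):

config.setJobType("extract"); // as suggested above
extractConfig.setDestinationFormat("NEWLINE_DELIMITED_JSON"); // CSV is the default; set this for JSON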
I had the same problem, but it turned out I had typed the name of the table wrong. However, Google did not generate an error message saying "the table does not exist", which would have helped me locate the problem.
Thanks!