I am getting the nextPageToken from the JSON result, but pagination still doesn't work. I can't see where the problem is; can anyone please look into this and help me fix it?
`
$jsonURL = file_get_contents("https://www.googleapis.com/youtube/v3/search?order=date&part=snippet&channelId={$channelID}&type=video&maxResults=2&key={$myAPiKey}");
$result = json_decode($jsonURL, true);
$nPt = $result['nextPageToken']; // same token for every item, so read it once, outside the loop
foreach ($result['items'] as $searchResult) {
$Tmburl = $searchResult['snippet']['thumbnails']['high']['url'];
// $views = $searchResult['statistics']['viewCount']; // search.list does not return 'statistics'; viewCount needs a separate videos.list call
$date = $searchResult['snippet']['publishedAt'];
$date = $searchResult['snippet']['publishedAt'];
echo '<h3>'.$searchResult['snippet']['title'].'</h3>';
echo '<img src="'.$Tmburl.'"/>';
echo '<p>'.$searchResult['snippet']['description'].'</p>';
}
// Pagination
echo '<h1>next</h1>';
`
The Next/Previous pagination is not working. I am getting the nextPageToken value, but the link is not functional.
I figured it out: just send the pageToken value in the JSON request URL, and pagination works.
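For reference, a minimal sketch of that fix, assuming $channelID and $myAPiKey are defined as in the question and that the token travels back through a pageToken query parameter on your own page (the parameter name is up to you):

$pageToken = isset($_GET['pageToken']) ? $_GET['pageToken'] : '';
$url = "https://www.googleapis.com/youtube/v3/search?order=date&part=snippet&channelId={$channelID}&type=video&maxResults=2&key={$myAPiKey}";
if ($pageToken !== '') {
    $url .= "&pageToken=" . urlencode($pageToken);
}
$result = json_decode(file_get_contents($url), true);
// ... render $result['items'] as before ...
// Only show the links when the API actually returns the tokens.
if (isset($result['prevPageToken'])) {
    echo '<a href="?pageToken=' . $result['prevPageToken'] . '">Previous</a> ';
}
if (isset($result['nextPageToken'])) {
    echo '<a href="?pageToken=' . $result['nextPageToken'] . '">Next</a>';
}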
I was using a simple piece of code that calls a Yahoo API to get just the weather for my city and put it on my web page. However, I just read that the public Yahoo API no longer works, and I don't know how to get this code working again. I have a Yahoo account and I created an API key, but I don't know how to proceed from here. If somebody can help me, this is the code:
<?php
/*Clima*/
if(isset($_POST['zipcode']) && is_numeric($_POST['zipcode'])){
$zipcode = $_POST['zipcode'];
}else{
$zipcode = 'ARMA0056';
}
$result = file_get_contents('http://weather.yahooapis.com/forecastrss?p=' . $zipcode . '&u=c');
$xml = simplexml_load_string($result);
//echo htmlspecialchars($result, ENT_QUOTES, 'UTF-8');
$xml->registerXPathNamespace('yweather', 'http://xml.weather.yahoo.com/ns/rss/1.0');
$location = $xml->channel->xpath('yweather:location');
if(!empty($location)){
foreach($xml->channel->item as $item){
$current = $item->xpath('yweather:condition');
$forecast = $item->xpath('yweather:forecast');
$current = $current[0];
$clima = <<<END
<span>{$current['temp']}°C</span>
END;
}
}else{
$clima = '<h1>No results found, please try a different zip code.</h1>';
}
/*Clima*/
?>
Just replace http://weather.yahooapis.com with http://xml.weather.yahoo.com. Credits to https://forum.rainmeter.net/viewtopic.php?f=13&t=23010
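In other words, the only line that changes in the script above is the request (a minimal sketch; as the follow-up below notes, this host later stopped working too):

$result = file_get_contents('http://xml.weather.yahoo.com/forecastrss?p=' . $zipcode . '&u=c');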
xml.weather.yahoo.com was the solution, but that URL does not seem to be working anymore. I'm now using Yahoo's YQL endpoint to get the XML, i.e. "https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20weather.forecast%20where%20woeid%3D2489314"
This seems to be the same XML with the exception of "results" added to the tree.
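A small sketch of reading that YQL response with the earlier approach; the namespace URI is assumed to be the same yweather one, and the //yweather:condition XPath reaches the node regardless of the extra "results" level:

$result = file_get_contents('https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20weather.forecast%20where%20woeid%3D2489314');
$xml = simplexml_load_string($result);
$xml->registerXPathNamespace('yweather', 'http://xml.weather.yahoo.com/ns/rss/1.0');
$condition = $xml->xpath('//yweather:condition');
if (!empty($condition)) {
    echo '<span>' . $condition[0]['temp'] . '°C</span>';
}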
Yahoo recently updated the way they handle requests. It used to work over any plain connection, but to make it more secure and easier to manage, they now require all requests to be signed with OAuth 1.0. Use the sample code they provide on their page and read the information from the JSON response.
See https://developer.yahoo.com/weather/ for more information.
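A rough sketch of such a one-legged OAuth 1.0 GET request in PHP, for orientation only: the endpoint URL, the X-Yahoo-App-Id header and the location/format parameters are assumptions based on Yahoo's sample code, so check them against the page linked above and substitute your own credentials.

<?php
// All three values come from the app you created on developer.yahoo.com (assumed names).
$appId          = 'your-app-id';
$consumerKey    = 'your-consumer-key';
$consumerSecret = 'your-consumer-secret';

$url   = 'https://weather-ydn-yql.media.yahoo.com/forecastrss'; // assumed endpoint
$query = array('location' => 'sunnyvale,ca', 'format' => 'json');

$oauth = array(
    'oauth_consumer_key'     => $consumerKey,
    'oauth_nonce'            => uniqid(mt_rand(1, 1000)),
    'oauth_signature_method' => 'HMAC-SHA1',
    'oauth_timestamp'        => time(),
    'oauth_version'          => '1.0',
);

// Signature base string: method & encoded URL & encoded, sorted parameter string.
$params = array_merge($query, $oauth);
ksort($params);
$pairs = array();
foreach ($params as $k => $v) {
    $pairs[] = rawurlencode($k) . '=' . rawurlencode($v);
}
$baseString = 'GET&' . rawurlencode($url) . '&' . rawurlencode(implode('&', $pairs));

// Sign with the consumer secret; there is no token secret for two-legged requests.
$oauth['oauth_signature'] = base64_encode(
    hash_hmac('sha1', $baseString, rawurlencode($consumerSecret) . '&', true)
);

// Build the Authorization header and make the call.
$headerParts = array();
foreach ($oauth as $k => $v) {
    $headerParts[] = $k . '="' . rawurlencode($v) . '"';
}
$session = curl_init($url . '?' . http_build_query($query));
curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
curl_setopt($session, CURLOPT_HTTPHEADER, array(
    'Authorization: OAuth ' . implode(', ', $headerParts),
    'X-Yahoo-App-Id: ' . $appId, // assumed header name from the sample code
));
$data = json_decode(curl_exec($session));
curl_close($session);
?>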
Yahoo changed some rules about the API; I made the following class, which works for me. Hope it works for you too.
To read other items from the response, change the line $fcast=$phpObj->query->results->channel->item->forecast; (a short sketch follows after the code and styles below).
<?php
date_default_timezone_set('CET');
class weatherfc{
public $result;
function weather($city){
$BASE_URL = "http://query.yahooapis.com/v1/public/yql";
$yql_query = 'select * from weather.forecast where woeid in (select woeid from geo.places(1) where text="'.$city.'") and u="c"';
$yql_query_url = $BASE_URL . "?q=" . urlencode($yql_query) . "&format=json";
// Make call with cURL
$session = curl_init($yql_query_url);
curl_setopt($session, CURLOPT_RETURNTRANSFER,true);
$json = curl_exec($session);
curl_close($session); // free the cURL handle once the response is in
// Convert JSON to PHP object
$phpObj = json_decode($json);
//var_dump($phpObj);
$weatherd='<div> Weather In '.$city.'<br>';
$fcast=$phpObj->query->results->channel->item->forecast;
foreach($fcast as $witem){
$fdate=DateTime::createFromFormat('j M Y', $witem->date);
$weatherd.= '<div class="days">';
$weatherd.= '<div class="item"><div>'.$fdate->format('d.m').' '.$witem->day.'</div><div class="image" style="width:90px !important; height:65px !important;"><img src="http://us.i1.yimg.com/us.yimg.com/i/us/nws/weather/gr/'.$witem->code.'d.png" width=90></div></div>';
$weatherd.= '<div><span>'.$witem->high.'°C</span>';
$weatherd.= '<span>'.$witem->low.'°C</span></div></div>';
}
$this->result=$weatherd;
}
}
$h= new weatherfc;
$h->weather("Antalya,Turkey");
echo $h->result;
?>
<style>
.days{
width:90px;
font-size:12px;
float:left;
font-family:Arial, Helvetica, sans-serif;
border:#999 1px dotted;
}
</style>
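As mentioned above, other fields can be read the same way inside the weather() method; a small sketch, with field names assumed from the same YQL response shape:

$current  = $phpObj->query->results->channel->item->condition;
$location = $phpObj->query->results->channel->location;
$weatherd .= '<div>Now: ' . $current->text . ', ' . $current->temp . '°C in ' . $location->city . '</div>';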
I need help with the YouTube API v3.
When I type the following in the browser:
https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=1&q=Titanic%201997%20Official%20Trailer&key=
it shows the returned values.
However, I am trying to collect that array from PHP.
How can I perform a GET request to https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=1&q=Titanic%201997%20Official%20Trailer&key= in PHP so that I get the result in PHP?
Or is there an option to get the search result in an RSS feed format?
Thanks in advance.
Make a simple video search call to the YouTube API and you will get the video title, description, video ID and image source. The full code for this approach is given below.
<?php
error_reporting(0);
$search = "Search Query"; // Search Query
$api = "YouTube API Key"; // YouTube Developer API Key
$results = 10; // Max results
$link = "https://www.googleapis.com/youtube/v3/search?safeSearch=moderate&order=relevance&part=snippet&q=".urlencode($search). "&maxResults=$results&key=". $api;
$video = file_get_contents($link);
$video = json_decode($video, true);
foreach ($video['items'] as $data){
$title = $data['snippet']['title'];
$description = $data['snippet']['description'];
$vid = $data['id']['videoId'];
$image = "https://i.ytimg.com/vi/$vid/default.jpg";
// Output Title/Description/Image URL If Video ID exist
if($vid){
echo "Title: $title<br />Description: $description<br />Video ID: $vid<br />Image URL: $image<hr>";
}
}
?>
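If allow_url_fopen is disabled on your host, the same request can be made with cURL instead of file_get_contents(); a minimal sketch reusing the $link variable built above:

$session = curl_init($link);
curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($session);
curl_close($session);
$video = json_decode($response, true); // same array structure as before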
I want to create a drop-down option in a Magento module that populates its data from a database table I created.
Previously, I had this code in my IndexController.php, which works. This is the first code.
public function dropdownAction() {
if (file_exists('./app/etc/local.xml')) {
$xml = simplexml_load_file('./app/etc/local.xml');
$tblprefix = $xml->global->resources->db->table_prefix;
$dbhost = $xml->global->resources->default_setup->connection->host;
$dbuser = $xml->global->resources->default_setup->connection->username;
$dbpass = $xml->global->resources->default_setup->connection->password;
$dbname = $xml->global->resources->default_setup->connection->dbname;
}
else {
exit('Failed to open ./app/etc/local.xml');
}
$link = mysql_connect($dbhost,$dbuser,$dbpass);
mysql_select_db($dbname) or die("Unable to select database");
$tblname = $tblprefix.'my_db_table';
$result = mysql_query("SELECT dropdowndata FROM ".$tblname."");
echo '<select>';
while ($ary = mysql_fetch_array($result)){
echo "<option>" . $ary['dropdowndata '] . "</option>";
}
echo "</select>";
mysql_close($link);
}
But I think the code above is not the Magento way. Do you agree?
Now, I want to populate the data with this code in IndexController.php. This is the second code.
public function dropdownAction() {
$options= Mage::getModel('my/model')->getCollection();
foreach($options as $option){
$optionData = $option->getDropdowndata();
echo "<select>";
echo "<option>" .$optionData."</option>";
echo "</select>";
}
}
Using the code above, the data is populated, but each drop-down contains only one value, so many drop-down boxes appear in the browser, each with a single option.
I think I am missing the equivalent of while ($ary = mysql_fetch_array($result)), but I am confused about how to include that.
So, my question is: how do I do the equivalent of mysql_fetch_array in Magento? Or can somebody please explain how to make the second code above work like the first code?
The getData() function returns an array of the whole row, and of course you need to move the 'select' tags out of the foreach:
echo "<select>";
foreach($options as $option){
$optionData = $option->getData();
echo "<option>" .$optionData['somekey'] ."</option>";
}
echo "</select>";
But I think it would be better to use the Magento magic getters; for example, if you have an 'entity_id' column in the DB, you can get its value using $option->getEntityId(), etc.
And why do you have the select inside of the foreach? I think something like this will solve your problem:
public function dropdownAction() {
$options= Mage::getModel('my/model')->getCollection();
echo "<select>";
foreach($options as $option){
$optionData = $option->getDropdowndata();
echo "<option>" .$optionData."</option>";
}
echo "</select>";
}
I'm sure that this is likely a simple solution, but I can't see my error. I'm making an API call to youtube to get some basic information on a youtube video using the video's ID; specifically what I want is the (1) title, (2) description, (3) tags, and (4) thumbnail.
When I load an API url via my web browser, I see all the data. I don't want to paste the entire response in this question, but paste the following url in your browser and you'll see what I see: http://gdata.youtube.com/feeds/api/videos/_83A00a5mG4
If you look closely you'll see media:thumbnail, media:keywords, content, etc. Everything I want is there. Now the troubles...
When I load this same url through the following functions (which I copied from the Vimeo api...), the thumbnail and keywords simply aren't there.
function curl_get($url) {
$curl = curl_init($url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_TIMEOUT, 30);
$return = curl_exec($curl);
curl_close($curl);
return $return;
}
// youtube_get_ID is defined elsewhere...
$request_url = "http://gdata.youtube.com/feeds/api/videos/" . youtube_get_ID($url);
$video_data = simplexml_load_string(curl_get($request_url));
These functions do give me a response with some data, but the keywords and thumbnail are missing. Could anyone please tell me why my thumbnail and keywords are missing? Thank you for any help!
Here's the link for documentation on this.
Here's a function I wrote:
function get_youtube_videos($max_number, $user_name) {
$xml = simplexml_load_file('http://gdata.youtube.com/feeds/api/users/' . $user_name . '/uploads?max-results=' . $max_number);
$server_time = $xml->updated;
$return = array();
foreach ($xml->entry as $video) {
$vid = array();
$vid['id'] = substr($video->id,42);
$vid['title'] = $video->title;
$vid['date'] = $video->published;
//$vid['desc'] = $video->content;
// get nodes in media: namespace for media information
$media = $video->children('http://search.yahoo.com/mrss/');
// get the video length
$yt = $media->children('http://gdata.youtube.com/schemas/2007');
$attrs = $yt->duration->attributes();
$vid['length'] = $attrs['seconds'];
// get video thumbnail
$attrs = $media->group->thumbnail[0]->attributes();
$vid['thumb'] = $attrs['url'];
// get <yt:stats> node for viewer statistics
$yt = $video->children('http://gdata.youtube.com/schemas/2007');
$attrs = $yt->statistics->attributes();
$vid['views'] = $attrs['viewCount'];
array_push($return, $vid);
}
return $return;
}
And here's the implementation:
$max_videos = 9;
$videos = get_youtube_videos($max_videos, 'thelonelyisland');
foreach($videos as $video) {
echo $video['title'] . '<br/>';
echo $video['id'] . '<br/>';
echo $video['date'] . '<br/>';
echo $video['views'] . '<br/>';
echo $video['thumb'] . '<br/>';
echo $video['length'] . '<br/>';
echo '<hr/>';
}
I'd like to be able to run a script that parses a Twitter page and compiles a list of tweets for a given time period - one week, to be more exact. Ideally it should return the results as an HTML list that could then be posted in a blog, like here:
http://www.perezfox.com/2009/07/12/the-week-in-tweet-for-2009-07-12/
I'm sure there's a script out there that can do it, unless the guy does it manually (that would be a big pain!). If there is such a script, forgive my ignorance.
Thanks.
Use the Twitter search API. For instance, this query returns my tweets between 2009-07-10 and 2009-07-17:
http://search.twitter.com/search.atom?q=from:tormodfj&since=2009-07-10&until=2009-07-17
For anyone who's interested, I hacked together a quick PHP parser that takes the XML output of the above feed and turns it into a nice list. If you post a lot of tweets, it's sensible to use the rpp parameter so that your feed doesn't get clipped at 15; the maximum is 100. So, by sticking this URL into NetNewsWire (or an equivalent feed reader):
http://search.twitter.com/search.atom?q=from:yourTwitterAccountHere&since=2009-07-13&until=2009-07-19&rpp=100
and exporting the XML to a file, you can use this script:
<?php
$date = "";
$in = 'links.xml'; //tweets
if (!file_exists($in)) { die('Failed to open xml data.'); }
$xml = simplexml_load_file($in);
foreach($xml->entry as $item)
{
$newdate = date("dS F", strtotime($item->published));
if ($date == "")
{
echo "<h2>$newdate</h2>\n<ul>\n";
}
elseif ($newdate != $date)
{
echo "</ul>\n<h2>$newdate</h2>\n<ul>\n";
}
echo "<li>\n<p>" . $item->content ." *</p>\n</li>\n";
$date = $newdate;
}
echo "</ul>\n";
?>
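If you would rather skip the intermediate file, the feed can also be loaded straight from the search URL; a small variant of the same script (swap in your own account and dates):

<?php
$feed = 'http://search.twitter.com/search.atom?q=from:yourTwitterAccountHere&since=2009-07-13&until=2009-07-19&rpp=100';
$xml = simplexml_load_file($feed) or die('Failed to open xml data.');
// ... then run the same foreach ($xml->entry as $item) loop as above ...
?>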