I used to call Google Translate TTS to download an audio file using this url:
http://translate.google.com/translate_tts?tl=en&q=Hello+world!
However Google changed the way that works and therefore I can no longer download the audio files.
I've signed up for a free trial for Google Translate API V2, but can't find how to get the TTS audio files.
Any idea?
You can use this link without a captcha:
https://translate.google.com/translate_tts?ie=UTF-8&tl=tr-TR&client=tw-ob&q=Batsın+bu+dünya+bitsin+bu+rüya
I stumbled across this thread and wanted to give my take on it, with reference to @Alexandre Andrade, mainly because he didn't submit any code.
I did this in a React app, but the same procedure should work for a vanilla web project.
I added the meta tag to the head of my public/index.html:
<head>
...
<meta name="referrer" content="no-referrer">
...
</head>
Then I added the audio tag to my component.
JavaScript:
const playTTS = (text, lang) => {
  // Get the audio element
  const audioEl = document.getElementById('tts-audio');
  // Build the TTS URL (encode the text so spaces and punctuation survive)
  const url = `https://translate.google.com/translate_tts?ie=UTF-8&tl=${lang}&client=tw-ob&q=${encodeURIComponent(text)}`;
  // Point the audio element at the generated speech
  audioEl.src = url;
  // Auto-play the sound
  audioEl.play();
};
HTML:
...
<audio controls id="tts-audio"/>
...
Then it's just a matter of hooking the function up to one of your lifecycle methods. Since I wrote my component with React hooks, I added the function call to a hook so it runs when the component is mounted (in a class component this would go in componentDidMount()); a sketch of that hook-up is below.
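For reference, here is a minimal sketch of that hook-up, assuming the playTTS function above is in scope and the component renders the #tts-audio element itself; the Speaker component name and the greeting text are just illustrative:
import React, { useEffect } from 'react';

const Speaker = () => {
  useEffect(() => {
    // Runs once after the component mounts (the hooks equivalent
    // of componentDidMount()), so the audio element already exists.
    playTTS('Hello world', 'en');
  }, []);

  return <audio controls id="tts-audio" />;
};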
Hope this helps anyone out!
Try this link for English:
https://translate.google.com/translate_tts?ie=UTF-8&client=tw-ob&tl=en&q=Hello+World
For Chinese (Putonghua):
https://translate.google.com/translate_tts?ie=UTF-8&client=tw-ob&tl=zh-CN&q=世界+你好
Text-to-speech was always an 'unofficial' API which is now captcha-protected to prevent abuse. It was never advertised as part of the Translate API, and currently there is no TTS functionality in the Translate V2 API, paid or otherwise.
There is some more background in the following Google Groups thread, which has been ongoing for some time.
Here's to those who have desperately been trying to play Google TTS as an audio in HTML: let me save you a couple of hours of time and tell you how to do it.
Let's say we have this link:
https://translate.google.com/translate_tts?ie=UTF-8&client=tw-ob&tl=en&q=I+love+coffee
If you try to play this audio from the link using <audio>, <iframe>, third-party libraries, or plain JavaScript...
var audio = new Audio('https://translate.google.com/translate_tts...');
audio.play();
...then you'll soon find out that none of these approaches works, because the request returns a 404 error.
Solution
Apparently, the only workable way to play this TTS audio is to use an <embed> tag wrapped in a custom <iframe> and give the link a unique version number (this is important: browser caching otherwise prevents the audio from playing, for some reason).
Here is the solution for our example (assuming you have an iframe#ttsiframe on the page):
//random integer helper used for the cache-busting version number
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

function playTTS(lang, sentence) {
  //get the iframe
  var iFrame = document.getElementById('ttsiframe');
  //remove its sandbox property
  iFrame.removeAttribute('sandbox');
  //reference variables for the iframe body and head tags
  var iFrameBody, iFrameHead;
  //get the body and head
  if (iFrame.contentDocument) { // FF
    iFrameBody = iFrame.contentDocument.getElementsByTagName('body')[0];
    iFrameHead = iFrame.contentDocument.getElementsByTagName('head')[0];
  } else if (iFrame.contentWindow) { // IE
    iFrameBody = iFrame.contentWindow.document.getElementsByTagName('body')[0];
    iFrameHead = iFrame.contentWindow.document.getElementsByTagName('head')[0];
  } else {
    iFrameBody = iFrame.contentDocument.body;
    iFrameHead = iFrame.contentDocument.head;
  }
  //generate the Google Translate TTS link from the arguments (note the random version number at the end)
  var link = 'https://translate.google.com/translate_tts?ie=UTF-8&client=tw-ob&tl=' + lang +
    '&q=' + sentence.replace(/ /g, '+').replace(/[.]/g, '') +
    '&rd=' + getRandomInt(0, 50000000);
  //add an embed element pointing at our link
  iFrameBody.innerHTML = '<embed src="' + link + '" id="TTS">';
  //isolate the iframe again
  iFrame.setAttribute('sandbox', '');
}
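For completeness, a quick usage sketch under the same assumption of an empty sandboxed iframe with id ttsiframe somewhere on the page:
// Markup assumed elsewhere on the page: <iframe id="ttsiframe" sandbox style="display:none"></iframe>
playTTS('en', 'I love coffee'); // speaks the example sentence from above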
You can simply use the link:
Text to Speech
I wanted to write a script to read the subtitles in an HTML5 video and provide a different view of them (outside the video player). A sample HTML5 video is here: https://www.youtube.com/watch?v=DNi60fyYjCA.
The subtitles seem to be built into the video, though; there are no subtitle elements on the page. I thought I might be able to get at them via the cuechange event, or the HTMLVideoElement properties track or textTracks. However, I'm not getting anything (.track and .textTracks are empty, and no cuechange event is fired on the video element).
Is it possible for me to read the subtitles built into the video when using the HTML5 player? I am using Chrome in this case, but could use a different browser.
Thanks
YouTube displays the captions/subtitles in a span element located in the player.
If you want to have a look for yourself, walk through the document this way:
body -> .body-container -> #page-container -> #page -> #player -> #player-mole-container -> #player-api -> .html5-video-player -> .ytp-player-content -> #caption-window-0 -> .caption-window-transform -> span.captions-text -> "subtitles"
(Note from frank:
I suggest using a MutationObserver to monitor the captions; the code below seems to work once captions are enabled (by pressing the .ytp-subtitles-button):
// select the target node (the captions span) and the video element
var target = document.querySelector('.captions-text');
var video = document.querySelector('video');

// create an observer instance that logs each caption change
var observer = new MutationObserver(function (mutations) {
  mutations.forEach(function (mutation) {
    console.log(mutation.type);
    console.log(video.currentTime);
    console.log(target.innerText);
  });
});

// pass in the target node, as well as the observer options
// (childList/subtree also catch the caption text being swapped in and out)
observer.observe(target, {
  attributes: true,
  childList: true,
  subtree: true
});
)
I am attempting to visualize audio coming out of an audio element on a webpage. The source for that element is a WebRTC stream connecting to an Asterisk call via sip.js. The audio works as intended.
However, when I attempt to get the frequency data using the Web Audio API, it returns an array of all 0's, even though the audio is working. This seems to be a problem with createMediaElementSource. If I call getUserMedia and use createMediaStreamSource to connect my microphone to the input, I do get frequency data back.
This was attempted in both Chrome 40.0 and Firefox 31.4. In my search I found similar errors with Android Chrome, but my versions of desktop Chrome and Firefox should be working correctly. So far I have a feeling the error may be due to the audio player getting its audio from another AudioContext in sip.js, or something to do with CORS. All of the demos I have tried work correctly, but they only use createMediaStreamSource to get mic audio, or use createMediaElementSource to play a file (rather than streaming to an element).
My Code:
var context = new (window.AudioContext || window.webkitAudioContext)();
var analyser = context.createAnalyser();
analyser.fftSize = 64;
analyser.minDecibels = -90;
analyser.maxDecibels = -10;
analyser.smoothingTimeConstant = 0.85;

var frequencyData = new Uint8Array(analyser.frequencyBinCount);
var visualisation = $("#visualisation");
var barSpacingPercent = 100 / analyser.frequencyBinCount;

for (var i = 0; i < analyser.frequencyBinCount; i++) {
  $("<div/>").css("left", i * barSpacingPercent + "%").appendTo(visualisation);
}

var bars = $("#visualisation > div");

function update() {
  window.requestAnimationFrame(update);
  analyser.getByteFrequencyData(frequencyData);
  bars.each(function (index, bar) {
    bar.style.height = frequencyData[index] + 'px';
    console.debug(frequencyData[index]);
  });
};

$("audio").bind('canplay', function() {
  source = context.createMediaElementSource(this);
  source.connect(analyser);
  update();
});
Any help is greatly appreciated.
Chrome doesn't support WebAudio processing of RTCPeerConnection output streams (remote streams); see this question. Their bug is here.
Edit: they now support this in Chrome 50
See the test code for Firefox that is about to land as part of this bug:
Bug 1081819. This bug will add WebAudio input to RTCPeerConnections in Firefox; we've had working WebAudio processing of output MediaStreams for some time. The test code there tests both sides; note that it depends a lot on the test framework, so just use it as a guide for hooking into WebAudio.
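In Firefox (and Chrome 50+, per the edit above), one way around the element-capture problem is to skip the <audio> element and connect the remote MediaStream to the analyser directly with createMediaStreamSource. A minimal sketch; how you obtain remoteStream from your sip.js session is left as an assumption:
function visualiseRemoteStream(remoteStream) {
  var context = new (window.AudioContext || window.webkitAudioContext)();
  var analyser = context.createAnalyser();
  analyser.fftSize = 64;

  // Feed the WebRTC stream straight into the audio graph instead of
  // going through createMediaElementSource on the <audio> element.
  var source = context.createMediaStreamSource(remoteStream);
  source.connect(analyser);

  var frequencyData = new Uint8Array(analyser.frequencyBinCount);
  (function update() {
    window.requestAnimationFrame(update);
    analyser.getByteFrequencyData(frequencyData);
    console.debug(frequencyData);
  })();
}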
I have dynamically created an HTML canvas image. I want to post it to a Facebook user's wall via the JavaScript SDK. What I am doing is:
I tried to convert the canvas to an image (a data URL) and provide finalimage in the FB.ui method:
var temp = canvas.toDataURL("image/png");
var finalimage = temp.replace(/^data:image\/(png|jpg);base64,/, "");

FB.ui(
  {
    method: 'feed',
    name: 'The Facebook SDK for Javascript',
    caption: 'Bringing Facebook to the desktop and mobile web',
    description: (
      'A small JavaScript library that allows you to harness ' +
      'the power of Facebook, bringing the user\'s identity, ' +
      'social graph and distribution power to your site.'
    ),
    link: 'https://developers.facebook.com/docs/reference/javascript/',
    picture: finalimage
  },
  function(response) {
    if (response && response.post_id) {
      alert('Post was published.');
    } else {
      alert('Post was not published.');
    }
  }
);
But I am getting an error:
Error Message: picture URL is not properly formatted
Can anyone please help me with how to do this?
You need to save the image on your own server to do that; the picture must be hosted outside Facebook, because the FB.ui "feed" dialog does not accept base64-encoded data as the picture.
The base64 data is also very large, and if you try to shorten it with a URL shortener you will run into problems too.
What you can do, and what I am doing, is use a PHP server to receive the base64 data and convert it into an image, like a dynamic image; the downside is the memory cost this creates on the server.
Another option is actually posting the canvas as an image to the user's profile. For that you need FB.api instead of FB.ui: the feed dialog is meant for sharing links, and a canvas has no title or description properties.
After posting the image you get a response with an ID, which is the post ID. That ID cannot be used in a feed dialog as a link or picture property, but you can use the Share Dialog if you want to share the post.
That is a piece of cake; I have done it and recommend you do the same.
Posting a canvas as an image using the API on the Facebook Platform works well, and you can do it with the JavaScript SDK.
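A minimal sketch of the "host it yourself" route: push the data URL to your own server, then hand the resulting public URL to the feed dialog. The /upload-image endpoint and its { url: ... } response shape are assumptions for illustration only:
// Hypothetical endpoint on your server that stores the PNG and
// responds with a public URL, e.g. { "url": "https://example.com/img/123.png" }
fetch('/upload-image', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ data: canvas.toDataURL('image/png') })
})
  .then(function (res) { return res.json(); })
  .then(function (body) {
    // picture is now a real, publicly reachable URL, which the feed dialog accepts
    FB.ui({
      method: 'feed',
      link: 'https://developers.facebook.com/docs/reference/javascript/',
      picture: body.url
    }, function (response) {
      if (response && response.post_id) {
        alert('Post was published.');
      }
    });
  });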
I am a complete novice with HTML5, and with coding for that matter. I have been trying to get to grips with the Web Audio API. I want a sound to play at the click of a button. I used a tutorial posted on HTML5Rocks, but cannot get it to work. I have tried to use jsFiddle to help me troubleshoot, but to no avail.
Here is my code:
http://jsfiddle.net/ue8WP/
Try this:
function playsound()
{
  var filepath = 'sounds/' + .... + '.mp3'; //example, fill in your own file name
  var audio = new Audio();
  audio.src = filepath;
  audio.controls = true; //note: controls are only visible if the element is added to the page
  audio.autoplay = true;
}
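Since the goal is playing the sound on a button click, you would then wire the function up to a click handler; the play-button id is an assumption about your markup:
// Hypothetical markup: <button id="play-button">Play</button>
document.getElementById('play-button').addEventListener('click', playsound);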
Here's a simplified version of your code that should work fine. You only need to load some other sample sound, since there is a problem with fetching the one you provided (not allowed by Access-Control-Allow-Origin).
http://jsfiddle.net/WB6Pw/3/
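The fiddle itself isn't reproduced here, but the Web Audio pattern it follows looks roughly like this sketch; the sounds/sample.mp3 path and the play-button id are placeholders, and the sample must be same-origin or served with CORS headers:
var context = new (window.AudioContext || window.webkitAudioContext)();
var buffer = null;

// Fetch and decode the sample up front
var request = new XMLHttpRequest();
request.open('GET', 'sounds/sample.mp3', true); // placeholder path
request.responseType = 'arraybuffer';
request.onload = function () {
  context.decodeAudioData(request.response, function (decoded) {
    buffer = decoded;
  });
};
request.send();

// Play the decoded buffer on each button click
document.getElementById('play-button').addEventListener('click', function () {
  if (!buffer) return; // not loaded yet
  var source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.start(0);
});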
I am trying to make a Flash banner in CS4 with AS3.
In this banner I have to embed YouTube videos.
My problem is that after the video loads, I can't see the usual controls (fullscreen, pause, stop, etc.) on the video, and the video autoplays by default.
I am using this code:
Security.allowDomain("*");
Security.allowDomain("www.youtube.com");
Security.allowDomain("youtube.com");
Security.allowDomain("s.ytimg.com");
Security.allowDomain("i.ytimg.com");
var my_player1:Object;
var my_loader1:Loader = new Loader();
my_loader1.load(new URLRequest("http://www.youtube.com/apiplayer?version=3"));
my_loader1.contentLoaderInfo.addEventListener(Event.INIT, onLoaderInit);
function onLoaderInit(e:Event):void {
  addChild(my_loader1);
  my_player1 = my_loader1.content;
  my_player1.addEventListener("onReady", onPlayerReady);
}

function onPlayerReady(e:Event):void {
  my_player1.setSize(200, 100);
  //////////////////////////////////
  //this example passes parameters//
  //my_player1.loadVideoByUrl("http://www.youtube.com/v/ID-YOUTUBE?autohide=1&autoplay=0&fs=1&rel=0",0);
  //////////////////////////////////
  //this one uses only the video id//
  my_player1.loadVideoByUrl("http://www.youtube.com/v/ID-YOUTUBE", 0);
}
I tried passing the parameters in the URL, but that doesn't seem to work.
I also checked the Google API for AS3 (http://code.google.com/apis/youtube/flash_api_reference.html), but honestly I can't find a way to implement what I need.
What is the way to get these controls to show on the video?
Thank you :)
I was trying different things and found a partial solution that I want to share:
Security.allowDomain("www.youtube.com");
Security.allowDomain("youtube.com");
Security.allowDomain("s.ytimg.com");
Security.allowDomain("i.ytimg.com");
Security.allowDomain("s.youtube.com");
var my_player1:Object;
var my_loader1:Loader = new Loader();
//before I used this:
//my_loader1.load(new URLRequest("http://www.youtube.com/apiplayer?version=3"));
//now I use this:
my_loader1.load(new URLRequest("http://www.youtube.com/v/ID_VIDEO?version=3"));
my_loader1.contentLoaderInfo.addEventListener(Event.INIT, onLoaderInit);
function onLoaderInit(e:Event):void {
  addChild(my_loader1);
  my_player1 = my_loader1.content;
  my_player1.addEventListener("onReady", onPlayerReady);
}

function onPlayerReady(e:Event):void {
  my_player1.setSize(200, 100);
  my_player1.loadVideoByUrl("http://www.youtube.com/v/ID_VIDEO", 0);
}
Basically, instead of "Loading the chromeless player" I use "Loading the embedded player".
My problem now is how to modify, for example, the size of the control bar, because it takes 35px of height and I want to reduce it.
Thanks