I need to do something unusual. Say a user is browsing my site, and before they navigate away I need to perform some actions, like logging them out, for which I need some data from the server. In onunload, an SJAX request (AJAX with the async parameter set to false) is issued, code flow hangs, and when the result comes in, code flow resumes and does what it needs to. If AJAX were used, the script would unload and never get the callback. OK, so that's what I want. The problem is that I'm dealing with something other than an XMLHttpRequest object, and I can't simply supply a false parameter to the request function. I'm looking for some way to wrap an asynchronous routine of this object into a synchronous one: if I call var result = myobject.get(data_to_send), code flow hangs here WHILE myobject calls this funky XMLHttpRequest-like object in asynchronous mode and waits for the callback; the callback is called, and myobject returns the result. So code stops synchronously outside of myobject.get, but waits asynchronously inside.
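For reference, here is a minimal sketch of the plain synchronous ("SJAX") XMLHttpRequest pattern described above; the /logout endpoint and the payload are hypothetical. Note that this only works because XMLHttpRequest itself offers a synchronous mode: a wrapper object that only exposes an asynchronous callback cannot be made to block in the same way, because JavaScript is single-threaded and the callback can never run while your code is busy waiting.

Code:
// Minimal sketch of the synchronous ("SJAX") pattern; endpoint and payload are made up.
window.onunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/logout", false);   // false = synchronous: execution blocks here
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("session=current");          // returns only after the server responds
    var result = xhr.responseText;        // safe to use immediately
};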
My workplace is putting in a large application that is basically split across 30 or so JavaScript files. I have some security concerns about this application.
My basic security concerns are:
1. Possible SQL injection and other forms of injection attacks on the URLs of the various server-side components the JavaScript accesses.
2. Possible client-side database access.
3. Incorrect use of HTTP GET for operations with possible side effects.
The security problems are probably relatively harmless, mainly because the application should be running behind a firewall.
However, I would like an analysis tool that can go over the JavaScript code and show me what URLs are being called with what parameters, and that can also handle JavaScript that writes new JavaScript into the page (so I can get all the JavaScript files of the application for analysis).
I know there are various JavaScript profilers and the like; is there anything out there that helps with the analysis of this kind of application?
What happens when you have an onclick event and an error occurs in it:
In an <a> element: onclick="zoomFullExtent(); return false;"
I know that there is an error happening in zoomFullExtent. I didn't define my own error handler, so the default one is used. (My browser is Firefox 1.0.)
I notice that when this error happens, the browser makes a request to the server.
I thought that if an error happened in zoomFullExtent, the default error handler would catch it and then zoomFullExtent would return normally. But that doesn't seem to happen. Instead, the whole onclick script seems to return or be aborted, and it apparently ends up returning true, so the request is made. Is there a page where this program flow is explained?
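For what it's worth, one way (a sketch only; zoomFullExtent comes from the question, the rest is illustrative) to keep the navigation suppressed even when the handler throws is to wrap the call in try/catch so the return false always runs:

Code:
<!-- Wrap the call so an uncaught error cannot skip the "return false" -->
<a href="fallback.html"
   onclick="try { zoomFullExtent(); } catch (e) { /* optionally report e */ } return false;">
   Zoom to full extent
</a>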
I am really new to coding and am trying to include a flowchart that I have created on my site. There is a series of 5 questions with 2 possible answers for each question. Can anyone help me with some code that shows only the first question and, based on how it is answered, opens up the second question, then the third once the second is answered, and so on?
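A minimal sketch of one way to do this with plain JavaScript; the ids, question text, and the showNext helper are all hypothetical:

Code:
<!-- Each question lives in its own container; only the first is visible initially -->
<div id="q1">
  Question 1?
  <button onclick="showNext('q2a')">Yes</button>
  <button onclick="showNext('q2b')">No</button>
</div>
<div id="q2a" style="display:none">Question 2 (yes branch) ...</div>
<div id="q2b" style="display:none">Question 2 (no branch) ...</div>

<script>
// Reveal the container that holds the next question in the chosen branch
function showNext(id) {
    document.getElementById(id).style.display = "block";
}
</script>

Each answer button simply reveals the container for the next question on its branch, so the flowchart unfolds one step at a time.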
I have a div that is editable (contentEditable = true). The div has a fixed size, like a letter page. When the user has written so much text that it overflows, I want to add a new div above and let the overflowing text flow into that other div. (It's the same behavior as MS Word in page view, but in the web.)
Is it possible to do something like this on the web?
One way to implement this is to check whether the text of the div overflows (I've found JavaScript examples on the web which do that). Then I need to move the text that overflows into the next div, but I haven't found a function that does this. Is there such a function?
Another way is to insert a gap at the position where a new page begins and use a background image that looks like a page border. With this solution the whole text stays in one div, but it looks as if it flows onto another page. For that, however, I have to add a gap between two lines at a specific position (the page border/margin and the gap between the pages). Does anyone have an idea how to realize this?
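For the first approach, here is a rough sketch under the assumption that the "pages" are sibling divs with a fixed height; moving text word by word until the page no longer overflows is only one possible strategy, and it only handles plain text (preserving markup would require splitting at the node level instead):

Code:
// Move overflowing words from a fixed-height "page" div into the next one.
// reflowPage and the word-by-word strategy are illustrative assumptions.
function reflowPage(page, nextPage) {
    while (page.scrollHeight > page.clientHeight) {   // content overflows the fixed height
        var words = page.innerText.split(" ");
        if (words.length <= 1) break;                 // nothing left to move
        var last = words.pop();                       // take the last word off this page
        page.innerText = words.join(" ");
        nextPage.innerText = last + " " + nextPage.innerText;
    }
}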
I am working on a form which has around 10 fields plus one field for a captcha. Assuming the user enters the wrong captcha code, I am trying to use Ajax to make sure the other field information isn't lost through a full form submission.
I have also written a validation script to ensure that all the field values have been entered.
I have written the Ajax script, and am using this:
Code:
xmlhttp.open("GET",url, false);
The reason I am making a sync call is that, depending on whether the user has entered the correct code, I am going to submit the form using JavaScript.
With IE, this works fine, but with FF it doesn't. Is there a workaround for this? If so, how?
The entire code for reference:
Code:
var xmlhttp;
function showHint(str) {
    if (str.length == 0) {
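One workaround, sketched here with a hypothetical checkcaptcha.php URL and form id, is to keep the request asynchronous and submit the form from the callback, so nothing depends on the call blocking:

Code:
// Asynchronous captcha check; the form is only submitted from the callback.
// The URL, the form id and the "OK" response value are assumptions.
function checkCaptcha(code) {
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function () {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            if (xmlhttp.responseText == "OK") {
                document.getElementById("myForm").submit();      // captcha correct: submit
            } else {
                alert("Wrong captcha code, please try again.");  // field values are preserved
            }
        }
    };
    xmlhttp.open("GET", "checkcaptcha.php?code=" + encodeURIComponent(code), true);
    xmlhttp.send(null);
}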
I need to create a callback for a line of code that performs asynchronous work, so that another line of code can be called after it is finished. I've found a number of webpages that attempt to show how this can be done with two functions, one calling the other in Russian-doll fashion, but I can't see how to do it with my code. It takes the URL of a sound file, redefines a previously defined embed to point at that sound file, and then plays it. The problem is that I lose focus on the document element that was selected before the playIt() function is called. So in the playIt() function I save the focused element in a variable and focus() it after the embed switcheroo and autoplay are performed. This doesn't work, because the "e.parentNode.replaceChild(clone, e);" is performed asynchronously; when it finishes, it clears the focus in the document (internal ids have changed? Reason unknown.) So I need the focus() code to follow the replaceChild() code. I want to accomplish this by having the focus() code execute as a callback following the replaceChild() code. How would I break this into two procedures, one calling the other, reproducing a synchronous flow?
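replaceChild() itself has no callback argument, but one common workaround (sketched here with hypothetical element ids and attribute names) is to defer the focus() call with setTimeout(..., 0) so it runs only after the browser has finished processing the replacement:

Code:
// Swap in the new embed, then restore focus on the next tick.
// The element lookups, attribute names and soundUrl are illustrative assumptions.
function playIt(soundUrl) {
    var previouslyFocused = document.activeElement;   // remember what had focus

    var e = document.getElementById("soundEmbed");
    var clone = e.cloneNode(true);
    clone.setAttribute("src", soundUrl);              // point the embed at the new file
    clone.setAttribute("autostart", "true");
    e.parentNode.replaceChild(clone, e);

    // Deferring with setTimeout(..., 0) lets the replacement settle before refocusing.
    setTimeout(function () {
        if (previouslyFocused) previouslyFocused.focus();
    }, 0);
}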
An experiment I'm doing requires a synchronous cross-domain request, without using a proxy. I wondered if anyone had any ideas to help me achieve this.
Below is what I have tried, including my conclusions/assumptions (which I'll happily be corrected on if it solves my problem!):
The requirement not to use a proxy means I can't use the synchronous mode of XMLHttpRequest, as it will not let me go cross-domain.
On-demand loading of JavaScript enables me to achieve the cross-domain request by loading JavaScript of the form:
callback(data);
which on loading calls callback(), but it is not obvious how to make this synchronous. I've also managed to get the same effect using a hidden IFRAME, but again it relies on a callback. Is there a good way to wrap/transform this into a synchronous request? Code:
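A sketch of the script-injection (JSONP-style) approach mentioned above; it stays asynchronous, since a dynamically inserted script cannot block the calling code, but everything that needs the data can be moved into a continuation. The URL and the callback query parameter are assumptions:

Code:
// Cross-domain request via dynamic <script> insertion (JSONP-style).
// The remote URL and the callback query parameter are illustrative assumptions.
function crossDomainGet(url, continuation) {
    window.callback = function (data) {        // the remote script calls callback(data)
        continuation(data);                    // resume the work that needed the data
        document.body.removeChild(script);     // clean up the temporary script tag
    };
    var script = document.createElement("script");
    script.src = url + "?callback=callback";
    document.body.appendChild(script);
}

// Usage: everything that must wait for the data goes inside the continuation.
crossDomainGet("http://example.com/data.js", function (data) {
    // use the data here
});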
I'd like to process several blocks of parallel actions, but in a sequential manner.
As an example:
Thus, I want to process blocks whose duration I don't know in advance, then run a couple of actions, and only then begin with another block. I have already tried doing this with .queue(), .ready(), etc., but that leads to very ugly or unusable code.
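If a jQuery version with Deferreds is available (1.5+ for $.when, 1.6+ for .promise()), one way to express "run these in parallel, then continue" is shown below; the selectors, animations and doSomethingInBetween are placeholders for the real blocks and actions:

Code:
// Block 1: several animations run in parallel; $.when() waits for all of them.
$.when(
    $("#a").fadeOut(800).promise(),
    $("#b").slideUp(1200).promise()
).then(function () {
    doSomethingInBetween();                    // the "couple of actions" between blocks

    // Block 2 only starts once block 1 and the in-between actions are done.
    $.when(
        $("#c").fadeIn(500).promise(),
        $("#d").animate({ left: 100 }, 900).promise()
    );
});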
I am using the awesome malsup Cycle plugin, but it seems like the sync: true option isn't working 100% synchronously. If I activate the sync option and let exactly the same images fade, a short opacity effect occurs.

Code:
$(document).ready(function() {
    $('.slideshow').cycle({
        fx: 'fade',
        sync: true,
        speed: 1000,
        timeout: 4000
    });
});

<div class="slideshow">
    <img src="[URL]" width="200" height="200" />
    <img src="[URL]" width="200" height="200" />
    <img src="[URL]" width="200" height="200" />
    <img src="[URL]" width="200" height="200" />
    <img src="[URL]" width="200" height="200" />
</div>

I would like to fade some images with only a few differences, and for the user it should look like only those few parts of the image change.
I have an Ajax call, and I want to display an Ajax loader image before it makes the call and then hide that image after the call completes. The code below works fine in FF. But when the code is run in IE, for some reason the Ajax call is made first and the image isn't displayed until after the Ajax call has completed. I've tried putting the .show() method before the Ajax call and even in the beforeSend option of the Ajax call; for some reason IE STILL makes the Ajax call, and waits for it to complete, before it displays the image.
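One workaround that is often suggested for this IE repaint problem (a sketch; #loader, the URL and the handlers are assumptions) is to show the loader first and start the Ajax call from a setTimeout, which gives IE a chance to repaint before the request begins:

Code:
// Show the loader, then give the browser one tick to repaint before requesting.
$("#loader").show();

setTimeout(function () {
    $.ajax({
        url: "/some/endpoint",
        success: function (data) {
            // handle the response here
        },
        complete: function () {
            $("#loader").hide();       // hide the loader whether it succeeded or failed
        }
    });
}, 0);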
I have been using asynchronous requests for a long time, so the response was processed in a callback function. I thought of not using async, so I made synchronous requests. The reason is that I don't have to add two more lines for checking the status and onreadystatechange... My synchronous requests look like this...
[Code]....
So from the code you can see that there is no need for a callback function or if conditions to check the ready state... There is no problem in the above code. The problem arises here: if I press the F5 (refresh) key or refresh the page while the process is waiting for the response, I get an NS_ERROR JavaScript error in Firefox (I have not yet checked this in IE), even though the process completes successfully. Why?
Do we have to check whether the page is navigating away during a synchronous operation and abort the request? Or what else could be the reason for the error? This does not happen with async requests, which is also one of the reasons for using async...
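A sketch of the abort-on-unload idea, written for an asynchronous request (a synchronous request blocks the script, so there is no opportunity to run an unload handler while it is pending); the URL and the handleResponse function are assumptions:

Code:
// Keep a reference to the pending request so it can be aborted on navigation.
var pendingRequest = new XMLHttpRequest();
pendingRequest.onreadystatechange = function () {
    if (pendingRequest.readyState == 4 && pendingRequest.status == 200) {
        handleResponse(pendingRequest.responseText);   // handleResponse is hypothetical
    }
};
pendingRequest.open("GET", "process.php", true);
pendingRequest.send(null);

// Abort a still-pending request when the user refreshes or navigates away,
// so the interrupted request does not surface as an error.
window.onbeforeunload = function () {
    if (pendingRequest && pendingRequest.readyState != 4) {
        pendingRequest.abort();
    }
};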
I'm trying to make a little loop that on each iteration executes a little PHP script that sends mail with the mail() function. The PHP script returns either success or failure. My intention is to append that message to a div after every execution.
showStatus(result) is nothing more than a .append(result), and zenMails(y) calls this function again to send the next mail. It does work, but it only updates the div after the entire loop is done. I'd think it would do so after every execution, since I call showStatus when the synchronous call is done, and only after that do I call the next iteration. Is there any way to work around this? (Making the call async doesn't work, because the port for sending the mail would still be in use.)
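The div usually cannot repaint while synchronous calls keep the script busy. A sketch of a sequential asynchronous chain, which still sends only one mail at a time (so the mail port is never in use twice at once) but lets the browser repaint between mails; sendmail.php, #status and the counts are assumptions:

Code:
// Send mails one at a time: the next request only starts from the previous
// request's callback, so the calls never overlap and the div can repaint.
function zenMails(y, total) {
    if (y >= total) return;                      // all mails sent

    $.get("sendmail.php", { index: y }, function (result) {
        $("#status").append(result + "<br/>");   // the status shows up immediately
        zenMails(y + 1, total);                  // only now start the next mail
    });
}

zenMails(0, 10);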
I have verified that the server is returning valid json. My jsonpCallback is called, but neither the success nor the error callbacks are. I'm stumped by this.
While learning how to use jQuery I created my own autocomplete textbox (although there is one in the jQuery UI lib). The user types something into a textbox, and after 3 characters the getJSON method is called. This processes the data by calling another site and, after receiving the data, displays the result in a div tag. If a user types something, it automatically searches; my problem is the delay, because if a user keeps typing he will still receive the (delayed) earlier results. So far it works as designed. My question is: how can I stop the earlier callbacks from being processed and only receive and display the result of the last callback?
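One common pattern (a sketch; the URL, parameter name, element ids and formatResults are all hypothetical) is to number the requests and ignore any response that did not come from the latest one:

Code:
// Ignore responses from outdated autocomplete requests.
var latestRequest = 0;

$("#searchBox").keyup(function () {
    var term = $(this).val();
    if (term.length < 3) return;

    var requestId = ++latestRequest;               // stamp this request
    $.getJSON("search.php", { q: term }, function (data) {
        if (requestId !== latestRequest) return;   // a newer request exists: drop this one
        $("#results").html(formatResults(data));   // only the newest result is displayed
    });
});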
Really excited about the new 1.4 release. Looks like a lot of good stuff. Anyway, I've been using the JSON format to pass data back from the server, and after upgrading to 1.4 I'm getting a parsererror, even when I simplify my response to something like:
In PHP I've been using:
to set the header.
I guess I'm stuck using the older version until I find a solution.
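For what it's worth, jQuery 1.4 switched to a strict JSON parser, so responses that older versions tolerated can now raise parsererror. The payloads below are illustrative of the difference:

Code:
// jQuery 1.4 validates JSON strictly, much like the native parser does.
JSON.parse('{"name": "test"}');   // valid: double-quoted keys and strings
JSON.parse("{'name': 'test'}");   // throws: single quotes are not valid JSON
JSON.parse('{name: "test"}');     // throws: unquoted keys are not valid JSON

Sending the response with a Content-Type of application/json and making sure the keys and strings are double-quoted is usually enough to make 1.4 happy.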
This could appear under Ajax; for example, you could have multiple objects that make a single Ajax call (say an RPC-like request), and you need to update the object that made the call with the result during the callback, but it doesn't have to be Ajax. The particular problem I'm thinking about happens to be Ajax, specifically multiple objects accessing the same Ajax request (meaning I can't use a global or temporary variable).
One way that seems like it would work (just thinking about it in my head) is to create a hash, pass the key through the request, store the key in the response, and pick it up on the callback side. Then remove the item from the hash when done.
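A simpler alternative to the hash (sketched here with a hypothetical Widget object, endpoint and update() method) is to create the callback inside the object, so the closure already knows which object made the request:

Code:
// Each object builds its own callback; the closure captures the object,
// so no global hash or key round-trip is needed.
function Widget(name) {
    this.name = name;
}

Widget.prototype.load = function () {
    var self = this;                        // capture the calling object
    $.getJSON("/rpc", { widget: this.name }, function (data) {
        self.update(data);                  // the right object receives its own result
    });
};

Widget.prototype.update = function (data) {
    // update this particular widget with the result
};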
Now I want to call a method a number of times from within the class, and when they have all finished I want a second method to be called. I don't want the methods themselves to be altered, i.e. I want this to be generic.
Here was my idea:
var timerCounter = 0;

function CallFunc(func, callbackFunc) {
    timerCounter--;
    if (timerCounter == 0) {
        callbackFunc();
    }
}
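As written, the counter is only ever decremented and func is never invoked. A sketch of one way to make the idea work, assuming each wrapped method can be handed a "done" callback that it calls when its asynchronous work finishes (all names are illustrative):

Code:
// Generic completion counter: track any number of async calls,
// then invoke a final callback once every one of them has reported done.
var pending = 0;

function callWithTracking(func, allDoneCallback) {
    pending++;                              // one more call in flight
    func(function done() {                  // hand the method a completion signal
        pending--;
        if (pending === 0) {
            allDoneCallback();              // every tracked call has finished
        }
    });
}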
The documentation on, e.g., the fadeIn() method does not specify any constraints on the callback argument. Does that mean that 'anything will work'? Specifically, is recursion allowed? I want to know whether the jQuery design has deliberately taken this into account. Yes, I can read the source code. No, I don't plan to do so (for now), since I consider this an essential gap in the documentation. [code]
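For what it's worth, recursive callbacks are a common idiom with jQuery effects; a small sketch (the selector is illustrative) that keeps blinking an element by calling itself from the completion callback:

Code:
// The completion callback schedules the next cycle; because it runs
// asynchronously after the animation, the call stack does not grow.
function blink() {
    $("#box").fadeOut(400, function () {
        $("#box").fadeIn(400, blink);   // recurse from the callback
    });
}

blink();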
I am trying to develop an iGoogle-like dashboard that uses JSONP to get the content of each widget from other (trusted) sites. Each widget is a div that takes care of getting its content using $.ajax() and uses the callback to update the div with the HTML content returned via JSONP. The problem I have happens only in Firefox (I'm using 3.6.3): when a site is unavailable or takes longer to return the JSONP content for a widget, it seems that the callbacks for the other widgets do not get executed. All the widgets stay in the "loading" state although I'm sure they have all received their answers. What's puzzling is that if I hit the "stop" button in Firefox, the content of the other widgets gets displayed (i.e. their display callbacks get executed).
If the calls are asynchronous, what prevents Firefox from executing the callbacks for the other widgets once a response is received? Do you have any idea of what's happening and whether there's some way around this?
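One thing worth trying (a sketch; the URLs, data attribute and element classes are assumptions, and it presumes a jQuery version, 1.5 or later, where the timeout and error options apply to JSONP requests) is giving each widget's request its own timeout so a stalled site errors out instead of staying pending:

Code:
// Give each widget's JSONP request its own timeout so one slow or
// unreachable site cannot leave that widget hanging forever.
$(".widget").each(function () {
    var $widget = $(this);
    $.ajax({
        url: $widget.data("feed"),        // e.g. http://example.com/widget?callback=?
        dataType: "jsonp",
        timeout: 5000,                    // give up after 5 seconds
        success: function (data) {
            $widget.html(data.html);      // assumed shape of the returned object
        },
        error: function () {
            $widget.html("Could not load this widget.");
        }
    });
});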