JQuery :: Autocomplete With Remote And Caching?
Mar 31, 2011. How much data can be cached in the browser? My records seem to be about 70 characters from roughly 7 columns in my database.
I'm using jQuery $.ajax or $.getJSON on document.ready to access data from the server through ColdFusion remote CFC files.
My URL looks something like this for the main page: [url]
When I go to a new page, i.e. [url]
The ajax functions are called but are not collecting new data (only in IE). IE caches the response and won't give me a new one unless I refresh the page, which I don't want to do.
Here's an example of a function I call. I've also tried $.getJSON with no success.
function getGrains() {
I've looked into ways to make sure IE won't cache files. Some of the popular suggestions are:
Add a random parameter to the url: I've done this:
Change get requests to posts: Done (see function)
Also, jQuery's cache:false doesn't work.
I'm not sure I can use the meta tags or header cache-control approaches, because it's not my HTML page that's being cached; it's these HTTP requests from within my JavaScript file, and they're not loading HTML, just JSON objects.
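For what it's worth, here is a minimal sketch combining the cache-busting ideas above (a POST, an explicit timestamp value in the data, and jQuery's cache option). The CFC path, method name, and parameter names are placeholders, not the actual endpoint:

function getGrains() {
    $.ajax({
        type: 'POST',                      // IE does not cache POSTs the way it caches GETs
        url: '/remote/grains.cfc',         // placeholder path to the ColdFusion remote CFC
        data: {
            method: 'getGrains',           // placeholder remote method name
            _ts: new Date().getTime()      // unique value per call so no two requests look identical
        },
        dataType: 'json',
        cache: false,                      // for GETs jQuery would append an "_=" timestamp
        success: function(data) {
            // use the returned JSON here
        }
    });
}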
I've started working with the jQuery UI 1.8 Autocomplete recently with remote data. I was pleased with the ease of initial implementation, but before long I ran into a classic race condition. As I type, the search query gets more specific, so it takes less time for the server to respond. As a consequence, it is possible for the old response to arrive after the most recent one. Obviously, this is producing undesirable effects. I'm a little frustrated that the plugin doesn't have a way to manage this. To me, it seems that it makes the "basic" remote data implementation unreliable in most real-world situations. It also seems as though this would be a common problem, but I've found very little literature on it.
I've found that autocomplete is a relatively new addition to jQuery UI, so I've put the frustration aside and started my own widget which extends autocomplete to solve for three things: 1) race conditions; 2) animated open/close; 3) caching. The nature of this post is two-fold: not only to share information I've gathered on the topic of race conditions with others who might be having the same trouble, but to (hopefully) gain some insight into how other people are solving for this.

The first piece of this puzzle was that $.ajax() (and related methods) return an XMLHTTPRequest instance. As described at stackoverflow.com, this grants us the ability to use the XMLHTTPRequest.abort() method. So I just keep a handle to the XMLHTTPRequest instance and, if it exists, call abort() before the next request is made. Using Firebug, I could see that the requests were being aborted as expected and the symptoms of the race condition ceased. So far, so good... then I got to IE. Not so much.

In IE, I was seeing run-time errors. The odd thing is that the run-time errors seemed to be coming from deep within jQuery UI rather than my code, yet commenting out the abort() avoided the run-time errors. After scratching my head for a while, I used the following simplified code to shed some light on the situation: [code]
In Firefox and Chrome, behavior is as expected - no alert box. However, in IE6 and IE8 (IE7 untested at this time), the success handler still fires! As it turns out, the run-time errors were because the response was undefined as it got passed through my success code path. My next thought was, "maybe I can just evaluate textStatus". Unfortunately, it turns out (as seen in the alert box) it contains the string "success".
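For anyone hitting the same thing, here is a rough sketch of the abort-plus-guard pattern described above, using the jQuery UI 1.8 autocomplete source callback. The element id and URL are placeholders; the check on the response data is the workaround for IE still firing the success handler after abort():

var currentRequest = null;

$('#search').autocomplete({
    source: function(request, response) {
        if (currentRequest) {
            currentRequest.abort();          // drop the previous in-flight request
        }
        currentRequest = $.ajax({
            url: '/search.json',             // placeholder remote source
            data: { term: request.term },
            dataType: 'json',
            success: function(data, textStatus) {
                // In IE the success handler can still fire for an aborted request
                // with an undefined response, so guard before using it.
                if (data) {
                    response(data);
                }
            },
            error: function() {
                response([]);                // keep the widget from waiting forever
            }
        });
    }
});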
I am using the jQuery autocomplete plugin from http://bassistance.de/jquery-plugins/jquery-plugin-autocomplete/
I need to perform an action if no data is returned from the autocomplete search, but there seems to be no way to do that. Any ideas, please?
Does anyone have a library or patch to call a handler if a user leaves an autocomplete field without choosing one of the autocomplete options - i.e. they've entered free text? I'm working with an app that populates multiple fields from a single auto-complete value, and our latest requirement is to clear out a bunch of fields if the user has entered something manually - rejecting autocomplete suggestions. My initial attempts at hooking into onkeyfoo and onblur haven't led anywhere productive, and I'm hoping someone else has managed to overcome the gnarly event and timing dependencies involved with onkeyfoo and blur being used for standard autocomplete behaviour.
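One hedged approach, assuming the jQuery UI widget: set a flag in the select callback and check it on blur, deferring the blur handler slightly because select fires after blur when a suggestion is clicked. The field ids, source URL, and clearing logic below are placeholders for the app's own fields:

var pickedFromList = false;

$('#city').autocomplete({
    source: '/cities.json',                  // placeholder suggestion source
    select: function(event, ui) {
        pickedFromList = true;               // the value came from a suggestion
    }
}).keydown(function() {
    pickedFromList = false;                  // further typing invalidates the pick
}).blur(function() {
    // Defer so a click on a suggestion can fire select before we check the flag.
    setTimeout(function() {
        if (!pickedFromList) {
            $('#state, #zip, #country').val('');   // placeholder dependent fields
        }
    }, 200);
});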
I am new to JavaScript and have one question: how can I use JavaScript to get the IP address of the remote user or remote web browser?
I have found two jQuery plugins and I am trying to combine an action, but to no avail. What I want to do is, after selecting an item from the autocomplete box, have it then run a change function and retrieve details. Here are my two pieces of code.
[Code]...
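Without seeing the two plugins, this is only a guess at the shape of it, assuming the jQuery UI autocomplete widget: trigger the details lookup from the select callback. The URLs, field ids, and response fields are placeholders:

$('#item').autocomplete({
    source: '/items.json',                          // placeholder suggestion source
    select: function(event, ui) {
        // As soon as a suggestion is chosen, fetch and display its details.
        $.getJSON('/item-details.json', { id: ui.item.value }, function(details) {
            $('#price').val(details.price);         // placeholder detail fields
            $('#stock').val(details.stock);
        });
    }
});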
I have a textarea element, and a link action to empty the textarea.
$('.area').empty();
Normally it works fine. But after an AJAX call to send the textarea value, empty() stops working. I also tried text() and html(); none of them works. The original text in the textarea still stays there. I have no way of changing its value. My guess is Firefox caches it or something.
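The likely explanation: a textarea's current contents live in its value property, not in its child text nodes, so empty()/text()/html() only touch the markup it started with. Clearing the value itself should work:

$('.area').val('');    // resets whatever the user (or the AJAX round-trip) left in the field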
I am using VS2010, and when I hit Ctrl+F5 it launches whichever browser I select to browse with and tells it not to use the cached version of that web page. But regardless of whether it's Chrome, Firefox, IE, Opera or Safari, they all seem to hit a wall and stop recognizing my code tweaks. How do I resolve this frustration? It is very troublesome when testing .json file changes and CSS changes.
It looks like IE is caching the response for some AJAX requests here. The app I'm working on is a catalog of sorts. Clicking the link for a category loads a set of "items". Those, in turn, may be deleted from the admin view. The delete works fine (I checked the DB) but, when loading the same set of category items, I'm seeing the list unchanged. That is, the thing that was deleted is still there. Sorry for the crap explanation, but I'm not really sure of a better way of putting it. Bottom line is, has anyone seen this sort of behavior with IE before? Can it cache the result of an AJAX request like that? And, if so, how can I guard against that? Can I set cache-control headers for an AJAX request?
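To the last questions: yes, IE will happily cache GET XHRs, and yes, Cache-Control headers sent with the AJAX response are respected. From the client side, a hedged sketch of the usual jQuery workaround (the category URL and variable are placeholders):

$.ajaxSetup({ cache: false });   // appends a unique "_=" timestamp to every GET from here on

var categoryId = 42;             // placeholder
$.get('/catalog/items', { category: categoryId }, function(items) {
    // render the freshly fetched item list; deleted items will no longer appear
});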
I am using a jQuery script to insert an iframe in the document after the page load completes, as follows:
jQuery(window).load(function() {
var container = jQuery('#container_id');
jQuery('<iframe id="my_iframe"
[code]....
I am working on a web app that pulls content using multiple JSON files. I have tried numerous methods of parsing the JSON, but only the following has worked for me.
$(function() { $(document).ready(function() {
$.getJSON("new.json",function(data) {
$.each(data.posts, function(i,data){ var div_data ='<h4>'+data.title+'</h4><p>'+data.description+'</p>';
[Code].....
Ultimately I would like to somehow store the contents of the JSON files locally and swap them out as new content becomes available. Unfortunately, I don't know how to do that. Is there also a way to use a more current version of jQuery to parse my JSON files?
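A rough sketch of one way to do the local storing, assuming localStorage is available (IE8+ and the other current browsers): render whatever is cached right away, then fetch and overwrite. "new.json" comes from the snippet above; the cache key and the #content container are assumptions:

function loadPosts(render) {
    var cached = localStorage.getItem('posts-cache');
    if (cached) {
        render(JSON.parse(cached));          // show stored content immediately
    }
    $.getJSON('new.json', function(data) {
        localStorage.setItem('posts-cache', JSON.stringify(data));
        render(data);                        // swap in fresh content once it arrives
    });
}

loadPosts(function(data) {
    $('#content').empty();
    $.each(data.posts, function(i, post) {
        $('#content').append('<h4>' + post.title + '</h4><p>' + post.description + '</p>');
    });
});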
I have a complex JS object. It manipulates the page DOM. Inside it I have many repetitive selections spread across functions. I would like to cache the selectors to enhance performance. How do other people do this? I can cache at the function level but not at the object level. When I try to cache at the object level, I seemingly end up with stale selectors that, while defined, don't actually work.
My gut feeling is that I can only cache at the function level? What I have read does not deal with caching inside a JS object.
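A sketch of how object-level caching can be made to work: build the cached selections in an init() that runs on document ready, not in the object literal itself (which is evaluated before the DOM exists and so caches empty selections), and re-cache after any code that replaces the nodes. The selectors below are placeholders:

var page = {
    $form: null,
    $rows: null,
    init: function() {
        // Cache once the DOM is actually there.
        this.$form = $('#search-form');
        this.$rows = $('#results tr');
        this.bindEvents();
    },
    refreshRows: function() {
        // If the rows get rebuilt, the old cached objects point at detached
        // nodes and "don't actually work" - re-cache after the rebuild.
        this.$rows = $('#results tr');
    },
    bindEvents: function() {
        var self = this;
        this.$form.submit(function(e) {
            e.preventDefault();
            self.$rows.hide();               // reuse the cached selection
        });
    }
};

$(document).ready(function() {
    page.init();
});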
I'm not having this issue with Mozilla Firefox, but it seems that the results of the code below are being cached in IE 8. When the browser makes another asynchronous call to the server with different results from the database, those results are not being displayed on the webpage. How do I fix this problem? As I mentioned above, the code works fine in Mozilla Firefox. The page displays data in IE 8 with no errors five seconds after the page has loaded, but as the information in the database changes, I'm not seeing those changes reflected on the page in 5-second intervals. I'm assuming this is a caching issue; if so, how do I fix it?
[Code]...
If I use JavaScript to 'read' an XML file, does that mean that the XML file gets downloaded to the user's cache? I'm building a quiz and I'd rather not have the answers too easily available. I believe there's no real way to secure anything client side with JavaScript, so I'm just trying to keep the curious at bay.
I have an OC4J application server hosting my application. I can see from the TCP monitor, e.g. for GET /scripts/main.js, that the server responds with a Last-Modified timestamp. However, for subsequent requests IE does not use If-Modified-Since, and furthermore each script in Local Settings is getting cached as main[1].js, main[2].js, etc., and also in different directories. This does not happen for all scripts, but if a page has some 15 scripts included, the last 5 scripts show this behaviour.
I had an incident in which I uploaded a swf file with an incorrect URL. Well, when I noticed the issue I uploaded a 'new' file - but those who had previously seen the site kept seeing this "older .swf". Without a huge process of renaming files, modifying HTML files, etc., is there a way to customize a meta tag or some other mechanism to have a '.swf' file NOT CACHE? It has to not cache in all browsers: IE, FF, Safari.
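One common trick, sketched here with placeholder markup: write the embed from JavaScript and append a query string to the .swf URL, so every browser sees a URL it has never cached. A version number bumped on each upload keeps the caching benefits; a timestamp disables caching entirely:

var swfUrl = 'movie.swf?v=' + new Date().getTime();    // placeholder file name
document.write('<embed src="' + swfUrl + '" width="640" height="360"></embed>');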
I'm passing the ASP parameters using the URL. The current page is files.asp, and I'm using window.location.href = "files.asp?action=deletefile" to pass the action to the server side.
My code never got executed (as if the page was cached) unless I put document.write("") before the window.location directive.
Here's the code:
function confirmDelete(x) {
    var potvrda = confirm("Kliknite OK za brisanje. Cancel za povratak."); // "Click OK to delete, Cancel to go back."
    if (potvrda == true) {
        var trans = "files.asp?action=" + x;
        document.write("");              // without this, the navigation appeared to hit the cached page
        window.location.href = trans;
    }
}
var xmlfileLoaded = xmlDoc.load(xmlFileName);
and we noticed that these files are not cached in a number of clients' Internet Explorer v6. We see this request in the HTTP request log. We also tried to gzip the request, but xmlDoc probably does not allow it.
I've created a page that uses HTTPRequest to include some XML data and allow the user to update that data. The problem is that the new data doesn't show up, even though the XML file is changed.

I can call the XML file up in a separate browser window, where I get the old data, refresh to get the new data, then when I refresh the first browser, the new data appears on the page.

I'm thinking this might be some sort of server caching issue. Has anyone else run into this? Does anyone have a solution?
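The usual workaround, sketched here with a placeholder XML file name: make each request URL unique so neither the browser nor any cache in between can serve the old copy.

function loadXml(callback) {
    var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                    : new ActiveXObject('Microsoft.XMLHTTP');
    xhr.open('GET', 'data.xml?nocache=' + new Date().getTime(), true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            callback(xhr.responseXML);       // always the current file, never the cached one
        }
    };
    xhr.send(null);
}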
This example will show that the first call to "doit()" prints the image 30 times with only one call to the server. Then I have a timeout and call doit() again, and this time it downloads the picture 30 times! It doesn't even have time to finish downloading the pictures before the next timeout kicks off, and if I let it go for a minute or two I have something like 300 attempts to download the same image.

I heard about a bug in IE that would require preloading the images using a hidden div, but that didn't work. I see that Google Maps manages to avoid reloading the image and gets it from cache. What am I doing wrong? This works perfectly in Firefox (i.e. it uses the cache and calls the server just once). Code:
I have a function which caches images for a slideshow once the script has loaded. Here is an excerpt:
var base_ref = "http://www.example.com";
var images = new Array();
var tmp_images = new Array();
tmp_images[0] = "image1.jpg";
tmp_images[1] = "image2.jpg";
tmp_images[2] = "image3.jpg";
tmp_images[3] = "image4.jpg";
tmp_images[4] = "image5.jpg";
function cache_images () {
for (var i=0; i < tmp_images.length; i++){
var cacheimage=new Image();
var tmp_name = tmp_images[i];
var url = tmp_images[i] + ".png";
cacheimage.src=url;
images[tmp_name]=cacheimage;
}
}
As it is, the script whizzes through the for loop and then moves on to the next function. The only trouble is that it moves on to the next function whilst it's still downloading the images from the server. Is there any way to make the script wait until the downloading is completed, so that I can then go on to resize the images if needed?
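A hedged sketch of one way to wait: count load events and only call the next step once every image has finished (or failed). It reuses tmp_images and images from the excerpt above; the path built from base_ref is an assumption, since the excerpt appends ".png" to names that already end in ".jpg":

function cache_images(done) {
    var remaining = tmp_images.length;
    for (var i = 0; i < tmp_images.length; i++) {
        (function(name) {
            var cacheimage = new Image();
            cacheimage.onload = cacheimage.onerror = function() {
                remaining--;
                if (remaining === 0) {
                    done();                          // everything is downloaded (or gave up)
                }
            };
            cacheimage.src = base_ref + '/' + name;  // assumed path layout
            images[name] = cacheimage;
        })(tmp_images[i]);
    }
}

cache_images(function() {
    // safe to measure and resize the images here
});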
I run a webcam streaming site, and it has a gallery of images saved every hour over 24 hours. The thumbnails don't seem to change much (?), so I wondered if there was a way to stop the caching using a JavaScript... script? This is what I'm dealing with:
<!-- timeshots -->
<div id="Html4" style="position:absolute;overflow:auto;left:286px;top:557px;width:455px;height:180px;z-index:10">
<a href="[URL]" rel="lightbox" title="GardenCam at 12:00am"><img src="[URL]" /></a>
<a href="[URL]" rel="lightbox" title="GardenCam at 1:00am"><img src="[URL]" /></a>
<a href="[URL]" rel="lightbox" title="GardenCam at 2:00am"><img src="[URL]" /></a> .....
Can anyone help figure out why my javascript preloaders aren't preloading! Code:
The navigation section utilises simple JS rollovers on the text options, with the onMouseRollover event loading an additional graphic image to the right of the menu. It appears almost instantaneously on broadband, but as I'm testing it on 56k and I'm trying to appease all users, I required the navigation images to preload (especially those giving the description of the option).
They appear to preload fine in Netscape 7 but not in IE6???
I won't list the entire source here apart from the following sections: Code:
We have a dynamically created JavaScript menu (from ASP), which is customised per user (I have already taken all the static code out into a separate cached .js file). The size of the 'dynamic' menu content can be as much as 10kB, and the menu typically does not change for the duration of the user's session - i.e. it would be nice to get the browser to 'cache' this. It is an Intranet application, and is typically aimed at IE6 clients only.

I have considered the following strategies:

1) Cookies - although the last thing I want is the whole menu coming back to the server on every HTTP request - but this would be useful IF there is e.g. a header option on the cookie to 'send' the cookie (Server -> Browser) without the browser ever sending it back to the server (but the browser still being able to 'read' the cookie).

2) Creating 'dynamic' JavaScript files - i.e. sending the output per user to a mangled .js file (e.g. with a session ID in the filename), served as a cached .js file. I would however need to clean up the files quite regularly, and giving IUSR file creation access doesn't seem a good idea. I would then get the browser to include the JS by generating ASP along the lines of:

<script language="JavaScript" src="TempScripts/Menu<%=UserSession%>.js"></script>

Is there any other way?

Second question: is there any way to get IE to stop sending the HTTP REFERER header up to the server (e.g. a RegKey)? This is pretty pointless on an Intranet app (I know there is a way to do it in Netscape).
I have noticed that when you set the src attribute of an img via script (instead of hard-coding it in the img tag), although it caches the images, if you leave the site and return, the browser has to re-cache the images. Is that just the way it is, or is there a way to not lose the cache?