I run a webcam streaming site and have a gallery of images saved every hour over 24 hours. The thumbnails don't seem to change much (?), so I wondered if there was a way to stop caching using a JavaScript... script? This is what I'm dealing with:
I'm not having this issue with Mozilla Firefox, but it seems that the results of the code below are being cached in IE 8. When the browser makes another asynchronous call to the server with different results from the database, those results are not being displayed on the webpage. How do I fix this problem? As I mentioned above, the code works fine in Mozilla Firefox. The page displays data in IE 8 with no errors five seconds after the page has loaded, but as the information in the database changes, I'm not seeing those changes reflected on the page at 5-second intervals.
function reload2() {
    $.get('vcci.php', function($xml) {
        // Make the XML query-able with jQuery
        $xml = $($xml);
        var $iso2 = $xml.find('data').text();
        //alert($iso2);
        $('#para').text($iso2);
    }, 'xml');
}
I'm assuming this is a caching issue; if so, how do I fix it?
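A common fix for this IE behaviour, sketched below on the assumption that vcci.php and #para are as in the snippet above, is to stop jQuery letting the browser cache the GET: either globally with $.ajaxSetup({ cache: false }) or per request via $.ajax, which appends a unique timestamp parameter to the URL so IE 8 cannot serve a stale response:

// Option 1: disable AJAX caching globally; affects every later $.get / $.ajax call
$.ajaxSetup({ cache: false });

// Option 2: do it per request with $.ajax instead of $.get
function reload2() {
    $.ajax({
        url: 'vcci.php',
        dataType: 'xml',
        cache: false, // jQuery appends "_=<timestamp>" so the URL is always unique
        success: function(xml) {
            var iso2 = $(xml).find('data').text();
            $('#para').text(iso2);
        }
    });
}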
I have a function which caches images for a slideshow once the script has loaded. Here is an excerpt:
var base_ref = "http://www.example.com";
var images = new Array();
var tmp_images = new Array();
tmp_images[0] = "image1.jpg";
tmp_images[1] = "image2.jpg";
tmp_images[2] = "image3.jpg";
tmp_images[3] = "image4.jpg";
tmp_images[4] = "image5.jpg";
function cache_images() {
    for (var i = 0; i < tmp_images.length; i++) {
        var cacheimage = new Image();
        var tmp_name = tmp_images[i];
        // Build the URL from the base reference and the file name
        // (the original appended ".png" to names that already end in ".jpg"
        // and never used base_ref, which looks like a transcription slip)
        var url = base_ref + "/" + tmp_images[i];
        cacheimage.src = url;
        images[tmp_name] = cacheimage;
    }
}
As it is, the script whizzes through the for loop and then moves on to the next function. The only trouble is that it moves on to the next function while it's still downloading the images from the server. Is there any way to make the script wait until the downloading is completed, so that I can then go on to resize the images if needed?
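One way to do this, sketched against the excerpt above (the done callback and the resize step are assumptions), is to count onload events and only move on once every image has reported in:

function cache_images(done) {
    var remaining = tmp_images.length;
    for (var i = 0; i < tmp_images.length; i++) {
        var cacheimage = new Image();
        var tmp_name = tmp_images[i];
        // Decrement once per finished download; fire the callback after the last one
        cacheimage.onload = cacheimage.onerror = function() {
            remaining--;
            if (remaining === 0 && typeof done === "function") {
                done();
            }
        };
        cacheimage.src = base_ref + "/" + tmp_images[i];
        images[tmp_name] = cacheimage;
    }
}

// Usage: pass the next step as a callback instead of calling it straight after the loop
cache_images(function() {
    // resize_images(); // hypothetical next step, runs only when all images have loaded
});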
I have noticed that when you set the src attribute of an img via script (instead of hard-coding it in the img tag), although the browser caches the images, if you leave the site and return it has to re-cache them. Is that just the way it is, or is there a way to not lose the cache?
I'm using a little JavaScript script I came across for a simple slideshow, and it works great. However, I want it to stop rotating images when it comes to the end of the slideshow and just stay on the last image. Is there a simple way to edit it to do that?
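Without seeing the particular script it is hard to be precise, but most simple slideshows advance an index inside a setInterval timer; a minimal sketch of the usual change (the slides array, the slideshow img id and the 5-second delay are all assumptions) is to clear the timer once the last image is showing instead of wrapping around to the first:

var slides = ["image1.jpg", "image2.jpg", "image3.jpg"]; // assumed image list
var current = 0;

var timer = setInterval(function() {
    current++;
    if (current >= slides.length - 1) {
        current = slides.length - 1; // clamp to the last image
        clearInterval(timer);        // stop rotating; the last image stays up
    }
    document.getElementById("slideshow").src = slides[current];
}, 5000);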
A small problem has come up and I'm not sure how to resolve it. The problem is I do not want my images to preload in an image gallery on one page. [URL]. The images in the gallery, as you can see, are preloading. How can I stop them from preloading? What would I need to do? I just need them to load whenever the user clicks on one. Preloading just takes up unnecessary time. Here is the JavaScript code: [URL]. What should I do? I do not know JavaScript very well.
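Since the gallery script itself is only linked, here is just a generic sketch of the usual approach: keep the full-size URL in a data-src attribute on each thumbnail link and only assign it to the viewer's src when the user clicks, so nothing is downloaded up front (the class name thumb and the viewer id are assumptions about the markup):

// Assumed markup: <a class="thumb" href="#" data-src="photos/large1.jpg">...</a>
// plus a single <img id="viewer"> that displays the chosen photo.
document.onclick = function(e) {
    e = e || window.event;
    var link = e.target || e.srcElement;
    if (link.className === "thumb") {
        // The large image is requested only now, at click time
        document.getElementById("viewer").src = link.getAttribute("data-src");
        return false; // stop the link from navigating
    }
};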
I have created an ecommerce site in Zen Cart 1.3.9h using a template by Bling Themes called "Digital Shop". Sadly, we cannot get any support. I have Image Handler 2 installed. My slider is producing blurry images, and I need to know how to set the variables in the JS script to stop it stretching the images. Please view my problem: [URL]. My pictures in other areas of the site are perfect and the resolution is acceptable; I am sure the problems are due to the slider. I found two JavaScript files that control the slider; they are in my store/includes/templates/digitalshop/jscript folder.
First code:
this.each(function() {
    var obj = $(this);
[code]....
If I use JavaScript to 'read' an XML file, does that mean the XML file gets downloaded to the user's cache?
I'm building a quiz and I'd rather not have the answers too easily available. I believe there's no real way to secure anything client-side with JavaScript, so I'm just trying to keep the curious at bay.
I have an OC4J application server hosting my application. I can see from the TCP monitor, for example, GET /scripts/main.js, and the server responds with a Last-Modified timestamp. However, for subsequent requests IE does not use If-Modified-Since, and furthermore each script in Local Settings gets cached as main[1].js, main[2].js, etc., and also in different directories. This does not happen for all scripts, but if a page has some 15 scripts included, the last 5 show this behaviour.
I had an incident in which I uploaded a SWF file with an incorrect URL. When I noticed the issue I uploaded a 'new' file, but those who had previously seen the site kept seeing the older .swf. Without a huge process of renaming files, modifying HTML files, etc., is there a way to customize a meta tag or use some other mechanism to have a .swf file NOT cache? It has to not cache in all browsers: IE, FF, Safari.
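A meta tag in the HTML will not control how browsers cache the .swf itself, but one widely used workaround, sketched here on the assumption that the movie is written into the page from JavaScript, is to append a version (or timestamp) query string to the .swf URL; when the file changes you bump the version and every browser fetches it as a brand-new resource:

// Hypothetical version marker: bump it each time the .swf is replaced
var SWF_VERSION = "2";
var swfUrl = "movie.swf?v=" + SWF_VERSION; // the query string defeats the cached copy

document.write(
    '<object type="application/x-shockwave-flash" data="' + swfUrl +
    '" width="600" height="400">' +
    '<param name="movie" value="' + swfUrl + '">' +
    '</object>'
);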
I'm passing the ASP parameters via the URL. The current page is files.asp, and I'm using window.location.href = 'files.asp?action=deletefile' to pass the action to the server side.
My code never got executed (as if the page was cached) unless I put document.write("") before the window.location directive.
Here's the code:
function confirmDelete(x) {
    // Croatian prompt: "Click OK to delete. Cancel to go back."
    var potvrda = confirm("Kliknite OK za brisanje. Cancel za povratak.");
    if (potvrda == true) {
        trans = "files.asp?action=" + x;
        document.write("");
        window.location.href = trans;
    }
}
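Rather than relying on document.write to defeat the cache, a more direct sketch is to make each navigation URL unique with a timestamp parameter; the extra ts parameter is an assumption and would simply be ignored by files.asp:

function confirmDelete(x) {
    if (confirm("Kliknite OK za brisanje. Cancel za povratak.")) {
        // A unique query string means the browser cannot serve this URL from cache
        window.location.href = "files.asp?action=" + x + "&ts=" + new Date().getTime();
    }
}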
I've created a page that uses XMLHttpRequest to include some XML data and allow the user to update that data. The problem is that the new data doesn't show up, even though the XML file has changed.
I can call the XML file up in a separate browser window, where I get the old data, refresh to get the new data, then when I refresh the first browser, the new data appears on the page.
I'm thinking this might be some sort of server caching issue. Has anyone else run into this? Does anyone have a solution?
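It is usually the browser (or an intermediate proxy) caching the GET for the XML file rather than the server itself. A common sketch, with data.xml and the callback name as assumptions, is to make each request URL unique and ask for a fresh copy via a request header:

function loadXml(callback) {
    var xhr = new XMLHttpRequest();
    // Unique query string: no previously cached response can match this URL
    xhr.open("GET", "data.xml?nocache=" + new Date().getTime(), true);
    // Ask the browser/proxies not to hand back a cached copy
    xhr.setRequestHeader("Cache-Control", "no-cache");
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
            callback(xhr.responseXML);
        }
    };
    xhr.send(null);
}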
This example shows that the first call to doit() prints the image 30 times with only one call to the server. Then I have a timeout and call doit() again, and this time it downloads the picture 30 times! It doesn't even have time to finish downloading the pictures before the next timeout kicks off, and if I let it go for a minute or two I have something like 300 requests for the same image.
I heard about a bug in IE that supposedly requires preloading the images using a hidden div, but that didn't work. I see that Google Maps manages to avoid reloading the image and gets it from cache. What am I doing wrong? This works perfectly in Firefox (i.e. it uses the cache and calls the server just once). Code:
Can anyone help me figure out why my JavaScript preloaders aren't preloading? Code:
The navigation section utilises simple JS rollovers on the text options, with the onmouseover event loading an additional graphic image to the right of the menu. It appears almost instantaneously on broadband, but since I'm testing on 56k and trying to appease all users, I need the navigation images to preload (especially those giving the description of the option).
They appear to preload fine in Netscape 7 but not in IE6???
I won't list the entire source here, apart from the following sections: Code:
I'm trying to make a person stay on the same page on Cancel, but the confirm takes the user to the next page as if they pressed OK. How can I stop it? code...
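The actual code isn't shown here, so this is only a sketch of the usual pattern: return the result of confirm() from the click handler, so pressing Cancel returns false and cancels the navigation (the link id and href are assumptions):

// Assumed markup: <a id="next" href="nextpage.html">Continue</a>
document.getElementById("next").onclick = function() {
    // confirm() returns false on Cancel, which cancels the link's navigation
    return confirm("Are you sure you want to continue?");
};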
We have a dynamically created JavaScript menu (from ASP), which is customised per user. (We have already taken all the static code out into a separate cached .js file.)
The size of the 'dynamic' menu content can be as much as 10kB, and the menu typically does not change for the duration of the user's session - i.e. it would be nice to get the browser to 'cache' this. It is an Intranet application, and is typically aimed at IE6 clients only.
I have considered the following strategies:
1) Cookies - although the last thing I want is the whole menu coming back to the server on every HTTP request. It would be useful IF there were, e.g., a header option on the cookie to 'send' the cookie (server -> browser) without the browser ever sending it back to the server, while still letting the browser 'read' the cookie.
2) Creating 'dynamic' JavaScript files - i.e. sending the output per user to a mangled .js file (e.g. with a session ID in the filename), which then becomes a cached JS file. We would, however, need to clean up the files quite regularly, and giving IUSR file-creation access doesn't seem a good idea. We would then get the browser to include the JS by generating ASP along the lines of: <script language="JavaScript" src="TempScripts/Menu<%=UserSession%>.js"></script>
Is there any other way?
Second question: is there any way to get IE to stop sending the HTTP Referer header to the server (e.g. via a RegKey)? It is pretty pointless on an intranet app. (I know there is a way to do it in Netscape.)
I have a textarea element, and a link action to empty the textarea.
$('.area').empty();
Normally it works fine, but after an AJAX call that sends the textarea value, empty() stops working. I also tried text() and html(); none of them work. The original text in the textarea stays there and I have no way of changing its value. My guess is Firefox caches it or something.
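For form controls the visible content is the element's value, not its child text node, which is all empty(), text() and html() touch; a sketch of the usual fix is to clear the value instead, e.g. in the AJAX completion handler (save.php is an assumed endpoint):

// Clears what the user currently sees in the textarea
$('.area').val('');

// If the old text reappears after the round trip, clear it once the request completes
$.post('save.php', { text: $('.area').val() }, function() {
    $('.area').val('');
});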
I am using VS2010, and when I hit Ctrl+F5 it launches whichever browser I select to browse with and tells it not to use the cached version of that web page. But regardless of whether it's Chrome, Firefox, IE, Opera or Safari, they all seem to hit a wall and stop recognizing my code tweaks. How do I resolve this frustration? It is very troublesome when testing .json file changes and CSS changes.
It looks like IE is caching the response for some AJAX requests here. The app I'm working on is a catalog of sorts. Clicking the link for a category loads a set of "items". Those, in turn, may be deleted from the admin view. The delete works fine (I checked the DB) but, when loading the same set of category items, I'm seeing the list unchanged. That is, the thing that was deleted is still there. Sorry for the crap explanation but I'm not really sure of a better way of putting it. Bottom line is, has anyone seen this sort of behavior with IE before? Can it cache the result of an AJAX request like that? And, if so, how can I guard against that? Can I set cache-control headers for an AJAX request?
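Yes, IE will happily cache GET-based AJAX responses. A sketch of the usual client-side guards, with the endpoint and jQuery usage as assumptions about how the category items are fetched:

$.ajax({
    url: '/catalog/items',   // assumed endpoint
    type: 'GET',
    cache: false,            // jQuery appends a unique "_=<timestamp>" parameter
    beforeSend: function(xhr) {
        // Explicitly ask for a fresh copy as well
        xhr.setRequestHeader('Cache-Control', 'no-cache');
    },
    success: function(items) {
        // render the refreshed item list here
    }
});

Setting Cache-Control: no-cache (or an Expires header in the past) on the server response for these endpoints is the more robust fix, since it works no matter how the request is issued.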
I am working on a web app that pulls content using multiple JSON files. I have tried numerous methods of parsing the JSON, but only the following has worked for me.
Ultimately, I would like to somehow store the contents of the JSON files locally and swap them out as new content becomes available. Unfortunately, I don't know how to do that. Is there also a method of using a more current version of jQuery to parse my JSON files?
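One way to keep the JSON around locally, sketched here assuming localStorage is available and a hypothetical feed.json that carries a version field, is to store the last fetched copy and only replace it when the server reports something newer; recent jQuery versions parse the JSON for you via $.getJSON:

function loadContent(render) {
    // Show whatever we already have straight away
    var cached = localStorage.getItem('feedData');
    if (cached) {
        render(JSON.parse(cached));
    }
    // Then check the server and swap in newer content when the version changes
    $.getJSON('feed.json', function(data) {
        if (String(data.version) !== localStorage.getItem('feedVersion')) {
            localStorage.setItem('feedData', JSON.stringify(data));
            localStorage.setItem('feedVersion', String(data.version));
            render(data);
        }
    });
}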
I have a complex JS object. It manipulates the page DOM. Inside it I have many repetitive selections spread across functions. I would like to cache the selectors to enhance performance. How do other people do this? I can cache on the function level but not the object. When I try to cache on the object level I seemingly end up with stale selectors that while defined don't actually work.
My gut feeling is that I can only cache at the function level?
What I have read does not deal with caching inside a JS object.
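The stale selections usually come from caching jQuery objects once and then rebuilding the DOM they point at. One sketch of object-level caching that stays valid (every name here is hypothetical) is to keep the cached objects as properties and re-run a single cacheElements step whenever the object replaces those nodes:

var widget = {
    $container: null, // cached jQuery objects live on the object, not in each function
    $rows: null,

    init: function() {
        this.cacheElements();
    },

    cacheElements: function() {
        this.$container = $('#widget');            // assumed container id
        this.$rows = this.$container.find('.row'); // assumed row class
    },

    redraw: function(html) {
        // Replacing the markup invalidates the old cached objects,
        // so refresh the cache immediately after the DOM change
        this.$container.html(html);
        this.cacheElements();
    },

    highlight: function() {
        this.$rows.addClass('active'); // reuses the cached selection
    }
};

widget.init();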