
caching - C# WebClient disable cache

Good day.

I'm using the WebClient class in my C# application to download the same file every minute. The application then performs a simple check to see whether the file has changed, and if it has, does something with it.

Since this file is downloaded every minute, the WebClient caching system caches it and does not download it again; it just gets it from the cache, and that gets in the way of checking whether the downloaded file is new.

So I would like to know how I can disable the caching system of the WebClient class.

I've tried:

Client.CachePolicy = new System.Net.Cache.RequestCachePolicy(System.Net.Cache.RequestCacheLevel.BypassCache);

I also tried headers:

Client.Headers.Add("Cache-Control", "no-cache");

That didn't work either. So how can I disable the cache for good?

Thanks.

EDIT

I also tried the following CacheLevels: NoCacheNoStore, BypassCache, Reload. No effect. However, if I reboot my computer the cache seems to be cleared, but I can't be rebooting the computer every time.
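
For reference, the attempts looked roughly like this (a consolidated sketch, not my exact code; the URL and file name are placeholders, and the exact combination of cache level and headers varied):

using System;
using System.Net;
using System.Net.Cache;

class NoCacheDownload
{
    static void Main()
    {
        var client = new WebClient();

        // Ask the request-level cache not to serve or store a cached copy.
        client.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);

        // Also send explicit no-cache headers for any intermediary that honours them.
        client.Headers[HttpRequestHeader.CacheControl] = "no-cache";
        client.Headers[HttpRequestHeader.Pragma] = "no-cache";

        // Placeholder URL and destination file.
        client.DownloadFile("http://example.com/data.txt", "data.txt");
    }
}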

UPDATE in light of recent activity (8 Sep 2012)

The answer marked as accepted solved my issue. To put it simply, I used Sockets to download the file, and that solved my problem: basically a GET request for the desired file. I won't go into detail on how to do it, because I'm sure you can find plenty of "how to" answers right here on SO to do the same yourself. That doesn't mean my solution is also the best one for you; my first advice is to read the other answers and see if any are useful.
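
For the curious, the idea goes roughly along these lines. This is a bare-bones sketch rather than my actual code; the host, path and plain HTTP port 80 are placeholder assumptions, and it dumps headers and body together without any parsing:

using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

class RawHttpGet
{
    static void Main()
    {
        // Placeholders: replace with the real host and resource path.
        const string host = "example.com";
        const string path = "/data.txt";

        using (var tcp = new TcpClient(host, 80))
        using (var stream = tcp.GetStream())
        {
            // Hand-written GET request; "Connection: close" makes the server end the stream.
            string request =
                "GET " + path + " HTTP/1.1\r\n" +
                "Host: " + host + "\r\n" +
                "Cache-Control: no-cache\r\n" +
                "Connection: close\r\n" +
                "\r\n";

            byte[] requestBytes = Encoding.ASCII.GetBytes(request);
            stream.Write(requestBytes, 0, requestBytes.Length);

            // Read the raw response: status line, headers and body in one go.
            using (var reader = new StreamReader(stream, Encoding.UTF8))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }
}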

Anyway, since this question has seen some recent activity, I thought I would add this update with some hints and ideas that should be considered by anyone facing a similar problem who has tried everything they could think of and is sure the problem doesn't lie in their code. It likely is the code in most cases, but sometimes we just don't quite see it; go take a walk and come back after a few minutes, and you will probably spot it at point-blank range, as if it had been the most obvious thing all along.

Either way, if you're sure, then I advise you to check whether your request goes through some other device with caching capabilities (computers, routers, proxies, ...) before it reaches the intended destination.

Consider that most requests go through some of the devices mentioned above, most commonly routers, unless of course you are connected directly to the Internet via your service provider's network.

At one point my own router was caching the file. Odd, I know, but it was the case: whenever I rebooted it or connected directly to the Internet, my caching problem went away. And no, there wasn't any other device connected to the router that could be blamed, only the computer and the router.

And by the way, a piece of general advice, although it mostly applies to those who work on their company's development machines rather than their own: could your development computer, by any chance, be running some kind of caching service? It is possible.

Furthermore, consider that many high-end websites or services use Content Delivery Networks (CDNs), and depending on the CDN provider, whenever a file is updated or changed it takes some time for the change to propagate across the entire network. It is therefore possible you had the bad luck of requesting a file in the middle of an update, and the CDN server closest to you had not finished updating.

In any case, especially if you are always requesting the same file over and over, or if you can't find where the problem lies, then if possible I advise you to reconsider the approach of requesting the same file time after time, and instead look into building a simple Web Service to satisfy the needs you originally intended to satisfy with that file.

And if you are considering that option, I think you will probably have an easier time building a REST-style Web API for your needs.
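
Just to illustrate the idea, and only as a sketch under my own assumptions (ASP.NET Web API here; the controller name and returned fields are made up), such an endpoint can be very small:

using System;
using System.Web.Http;

// Hypothetical controller: exposes the current state (or a last-modified stamp)
// so clients can ask "has anything changed?" instead of re-downloading a file.
public class FileStatusController : ApiController
{
    // GET api/filestatus
    public object Get()
    {
        // Placeholder values; a real service would read these from its data source.
        return new
        {
            LastModifiedUtc = DateTime.UtcNow,
            Version = 42
        };
    }
}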

I hope this update is useful to you in some way; it certainly would have been to me a while back. Best of luck with your coding endeavors.



1 Answer


You could try appending a random number to your URL as part of the query string each time you download the file. This ensures the URL is unique each time.

For example:

// Append a random query-string parameter so each request uses a unique URL;
// URL-based caches along the way then treat every download as a new resource.
Random random = new Random();
string url = originalUrl + "?random=" + random.Next().ToString();
webclient.DownloadFile(url, downloadedfileurl);
