What you need to do is use the POST method; you can't get around this if your URL is going to be over the limit (although you can delay the inevitable by abbreviating names and values). Take the POST and then redirect the client to a GET. To keep things addressable, you can either store the search server-side against a key and retrieve it on the GET (e.g. from memory or a database), or you can encode the parameters into a single querystring key (or a smaller number of keys) that captures the behaviour. The resulting response will then be bookmarkable.
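Here is a minimal sketch of that POST-then-redirect-to-GET pattern, assuming a Node/Express app; the route names, the in-memory Map, and the random key are illustrative choices, not part of the original answer:

const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.urlencoded({ extended: true }));

const searches = new Map(); // server-side store: key -> search parameters

// Accept the oversized search via POST, store it, redirect to a short GET URL
app.post('/search', (req, res) => {
  const key = crypto.randomBytes(8).toString('hex');
  searches.set(key, req.body);
  res.redirect(303, '/search/' + key);
});

// The GET URL stays short, addressable, and bookmarkable
app.get('/search/:key', (req, res) => {
  const criteria = searches.get(req.params.key);
  if (!criteria) return res.status(404).send('Search not found');
  res.json(criteria); // in a real app, run the search with these criteria
});

app.listen(3000);

An in-memory Map loses the stored searches on restart; a database or a cache with expiry would make the bookmarked URLs longer-lived.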
Making a large image smaller using CSS (downscaling) results in blurry images in IE9 and Firefox, but looks fine in Chrome
Generally you shouldn't do this. The reason is that the user still downloads the larger image: imagine you have a 2 MB image you want to show on a website; the user would have to download the full 2 MB file just to view the smaller version. It is best practice to resize the image on the server, create a thumbnail, and link to that instead.
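As a sketch of the server-side resize, here is one way to generate a thumbnail in Node using the sharp package (the filenames and the 200px width are placeholder assumptions):

const sharp = require('sharp');

// Resize once on the server so visitors download the small file, not the 2 MB original
sharp('photo-large.jpg')
  .resize(200) // fixed width; height scales to preserve the aspect ratio
  .toFile('photo-thumb.jpg')
  .then(() => console.log('thumbnail written'))
  .catch(console.error);

The page can then link the thumbnail to the full-size image for users who actually want it.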
I'm making a script that gets some rows from MySQL and returns them to the user. The number of rows has grown to around 300, and the page now takes noticeably longer to load. I want to paginate them, with each page containing 50 rows, so I'd have 6 pages.

Use LIMIT in your SQL statements. For the first page:
SELECT * FROM `wherever` ORDER BY `whatever` LIMIT 0,50
and for an arbitrary page, compute the starting offset and substitute it:
SELECT * FROM `wherever` ORDER BY `whatever` LIMIT $start,50
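To make the offset arithmetic concrete, here is a small sketch in Node using the mysql2 package (the connection settings, table, and column names are placeholders; the answer itself only specifies the LIMIT clause):

const mysql = require('mysql2/promise');

async function getPage(page) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'mydb' });
  const pageSize = 50;
  const offset = (page - 1) * pageSize; // page 1 -> 0, page 2 -> 50, ...
  // ? placeholders keep the values safely escaped
  const [rows] = await conn.query(
    'SELECT * FROM `wherever` ORDER BY `whatever` LIMIT ?, ?',
    [offset, pageSize]
  );
  await conn.end();
  return rows;
}

With 300 rows and a page size of 50, pages 1 through 6 map to offsets 0, 50, 100, 150, 200, and 250.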
I'm trying to work out how I can paginate my data efficiently. If I have a URL such as example.com/items?page=5, how can I retrieve the correct records every single time (so that the same record doesn't show up on another page)? I thought I'd have to sort the DB records first, then use Mongoose to do a range query such as the one below.

You can use the following query:
var query = Model.find(conditions).sort(sort).skip(skip).limit(limit);
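As a sketch of how the pieces fit together for ?page=5 (the Item model, the page size, and the Express-style req/res are assumptions), sorting on a stable, unique field such as _id is what keeps the same record from appearing on two different pages:

const page = Math.max(1, parseInt(req.query.page, 10) || 1);
const limit = 50;
const skip = (page - 1) * limit;

Item.find({})
  .sort({ _id: 1 }) // stable, unique sort key -> consistent page boundaries
  .skip(skip)
  .limit(limit)
  .exec()
  .then(items => res.json(items))
  .catch(err => res.status(500).send(err.message));

Note that skip/limit gets slower as the offset grows; for large collections, a range query on the last seen _id (keyset pagination) scales better, which matches the range-query idea in the question.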