LaxLoafer 71 Posting Whiz in Training

What reasons make you suspect it's the RAM?

I think it's more likely a CMOS battery issue, because you mentioned the computer is old, it was shipped without power, and you had to reset the system clock.

A blown fuse also seems unlikely, given that you succeeded in charging the battery. Actually, I can't recall ever seeing a fuse inside a VAIO's case. Opening them up is tricky, as there tend to be hidden lugs that break off if you're not sure what you're doing, but you'll find it easier if you have access to a service manual.

If the laptop is very old I would seriously question whether it's worth sending back for repair. Laptop screens tend not to last forever and are prohibitively expensive to replace. You might get the current issue fixed only to find something else breaks a few months down the road. How old is your VAIO?

LaxLoafer 71 Posting Whiz in Training

Try removing the battery. I'd normally expect the Vaio to work off mains alone. If it doesn't, this might suggest a problem with your power cable or adapter. Let us know whether it works.

When charging batteries that have been completely exhausted it's not unusual for the charging indicator light to take several minutes before blinking. At least that's my experience with Vaios.

You mention the laptop is old and had been shipped somewhere without its battery installed. When no power is supplied, whether from the battery or the mains, the CMOS battery will discharge more quickly. Did you notice whether the system clock had changed? If so, the CMOS battery may need replacing. Typically they need replacing every five years.

Another possibility is that something has broken off from the recent soldering work. I know it's not very likely, but with the power turned off I'd still give the laptop a gentle rock back and forth and listen for anything unusual rolling around inside.

LaxLoafer 71 Posting Whiz in Training

Hi Kwesi

Have a read of the GNU General Public License (GPL), just the preamble section. It should give you a general idea of what open source is about. There are of course other open source licenses, such as the AGPL, MPL, Apache and MIT licenses.

If you're looking to include open source software in your projects, make sure you read the full license and comply with its requirements. Typically you'll need to acknowledge any code you have used and redistribute a copy of the license. Some open source licenses may also oblige you to make your own code public.

Proprietary software licenses, in comparison, are closed source: you're not usually free to look at or modify the code. If your business depends on proprietary software, check whether the vendor offers an escrow agreement. These can provide a level of safety by granting access to the source code in certain circumstances.

LaxLoafer 71 Posting Whiz in Training

To access the char through a pointer, the pointer must be dereferenced, which is additional work for the processor. A pointer is also larger than a char, so it might consume more memory, although that depends on how the struct is laid out. As the char is inside a struct, and I would expect the struct to be aligned to an addressable boundary, the difference between the pointer size and the char size isn't going to be an issue. I'd guess your example without the pointer is going to be slightly more efficient.

LaxLoafer 71 Posting Whiz in Training

Have you installed the Windows Azure SDK for your development environment?

Once you have done so you'll find options for creating Azure projects and deploying them from within Visual Studio.

Azure provides a variety of ways to host web sites:

  • Windows Azure Web Sites (WAWS)
  • Web Roles (Cloud Services)
  • Azure Virtual Machines

I'm unsure whether WAWS supports classic ASP - it's a rather dated scripting language. What you could try is creating an ASP.NET project and then adding an ASP page to see if it's enabled.

The other two options, Web Roles and VMs, give you more flexibility and can be configured to run classic ASP.

LaxLoafer 71 Posting Whiz in Training

If you're able to remove the back-links then there's no need to disavow them. The disavow tool is intended to be used as a last resort, when all other efforts have failed.

Whether you should remove the links is another matter. Are they having a detrimental effect? This is something you could possibly test for, as the site is under your control, but you might find the answer is already in your analytics data. Did traffic from organic search results fall off when the links were introduced?

I suspect you'll find the 22,000 links have little influence.

Thousands of links from a single domain, all with exactly the same anchor text, is strongly indicative of a footer link. It's not difficult to identify them and I wouldn't be surprised if search engines simply count them all as just one.

As for linking from a site under your control to another, this has to be one of the easiest ways to gain a back link. What value do you think search engines might place on such a link?

Will the link get flagged as spam by Matt Cutts? I think this depends on relevancy, and possibly topic. For example, linking from a fashion site to 'pay day loans' with misleading link text could look extremely spammy. You should be safe. Is Matt reading this post?

LaxLoafer 71 Posting Whiz in Training

You might be able to disable directory browsing in your web server's configuration. Visitors should then see an HTTP 403 error page instead.

Another option would be to redirect requests for the URL to products.html, or some other page.

If a web server finds a default web page located in the directory it will serve that instead of a directory listing. Default web pages are typically named something like 'index.htm', 'default.aspx', or 'index.php'. It depends on your configuration.

So, if you wish to prevent visitors from browsing a directory, you might get away with simply dropping in an empty default page.
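
If your server happens to be Apache, directory listings can also be switched off with a one-line .htaccess file in the folder concerned. A rough sketch (whether .htaccess overrides are honoured depends on your configuration):

# .htaccess - disable automatic directory listings for this folder
Options -Indexes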

Which web server are you using?

LaxLoafer 71 Posting Whiz in Training

dd if=/dev/zero of=/dev/sda bs=1M

Simply filling a drive with null characters is insufficient to prevent data from being recovered. Sanitizing a disk properly involves slightly more work.

Regulations such as HIPAA require hard drives to be properly sanitized, and sanitization standards typically call for several overwriting passes - for example setting all bits to 1s, then 0s, followed by a pass of random values.
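
On a Linux box, GNU shred will take care of the multiple passes for you. A rough sketch, assuming the disk shows up as /dev/sdX (a placeholder; double-check the device name before running anything like this):

# Three overwrite passes, then a final pass of zeros; -v shows progress.
# /dev/sdX is a placeholder - substitute the correct device with great care.
shred -v -n 3 -z /dev/sdX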

LaxLoafer 71 Posting Whiz in Training

Hi M,

There are a variety of ways you can pass data between pages. Have you considered using cookies, session variables, query strings, or form requests?

Could you tell us which methods you have tried? Which of them don't match your requirements, and why?

You may find Microsoft's ASP.NET Session State Overview page helpful.

LaxLoafer 71 Posting Whiz in Training

Taking ownership of a file owned by SYSTEM isn't normally a problem. Perhaps something is holding the file open? Have you tried booting into safe mode and attempting to take ownership?

If you succeed, I expect you'll then need to assign some permissions to allow you to delete the file; otherwise you'll see an access denied error.

Did Gerbil's suggestion work out?

LaxLoafer 71 Posting Whiz in Training

You might need to take ownership of the file before you can delete it. Open a command prompt as administrator and type takeown /? for more help.
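
For example, assuming the file sits at C:\temp\stubborn.dll (just a placeholder path), something along these lines should do it from an elevated command prompt:

rem Take ownership of the file, grant yourself full control, then delete it.
takeown /f "C:\temp\stubborn.dll"
icacls "C:\temp\stubborn.dll" /grant Administrators:F
del "C:\temp\stubborn.dll"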

LaxLoafer 71 Posting Whiz in Training

There's a C implementation in the Adobe PostScript SDK. Have a look on the developer site, under filters...

LaxLoafer 71 Posting Whiz in Training

Hi Shika

I'm unsure what you meant by...

...the division of class should come inside the second division

Which class were you referring to? Could you explain further?

Looking at your code, there are a couple of problems I can immediately see.

On line 29 you're attempting to append 'e' to a div in the document body. As e is the document.body, I don't believe this will work.

On the following line you're attempting to set the hidden attribute of an input element to zero, referenced by its ID. Unfortunately there are two elements on the page named 'done'. Element IDs should be unique.

LaxLoafer 71 Posting Whiz in Training

Is that URL correct, or have the NSA squished it already? Doesn't seem to be working.

LaxLoafer 71 Posting Whiz in Training

In what way could I make the post better or at least look better?

It's generally good practice to keep code as short as possible, especially when seeking help. Minimal examples are easier to read and can help others to understand what your code is about.

Putting your characters into classes will help you get to grips with OOP principles. Grab any C++ textbook and you'll invariably find a tutorial that closely matches what you're trying to do.

With a map, would it be better to do a map the way I have it, or storing the map in an array of each different area?

By 'map', I take it you're referring to a map of in-game locations and not an associative array (map). Looking at your code, the bulk of it defines various in-game locations or areas. Each location is defined using a function, which contains a description and quite a lot of duplicate code. While this works, it's less than optimal and doesn't allow you to create or modify locations at runtime. Using an array would be an improvement, but for something more sophisticated have a look into using 'linked lists', when you're ready ;-)

LaxLoafer 71 Posting Whiz in Training

There are plenty of standalone sitemap generators around. Unfortunately I cannot recommend any in particular, but it will be better to choose a server-side tool. Online sitemap generators won't be able to discover content that isn't already linked.

Try this search: http://duckduckgo.com/?q=server-side+sitemap+generator

Also, a sitemap generator is the type of feature I'd expect to find built-in to a content management system, or possibly available as an add-on. Do you have a CMS? What web server software are you using?

LaxLoafer 71 Posting Whiz in Training

What version of Internet Explorer and Windows are you running?

With the release of IE9, Microsoft deprecated an interface that components like ABCpdf depended on for rendering HTML. This issue affects ABCpdf version 8 and older. I would suggest trying the latest version to see if it resolves the problem. Otherwise, if you're tied to using an older version, you might try downgrading to IE8, but I can't recommend or guarantee that'll work.

LaxLoafer 71 Posting Whiz in Training

Was it a Windows or ABCpdf update that affected your installation? Did you try both MSHTML and Gecko rendering engines?

LaxLoafer 71 Posting Whiz in Training

Adding the extra URLs won't hurt. And the sitemap will allow you to specify things like priority and update frequency of a page, which search engines will take as a hint.

If your sitemap tool has missed entries, you can always add them manually in a text editor. More information on sitemaps can be found here: http://www.sitemaps.org/

Google search results typically show only a sample of the data, and for some reason they tend to understate the number of pages actually indexed. Google Webmaster Tools, on the other hand, should give you a better idea of how many pages are indexed. BTW, your screen grab shows pages indexed from your sitemap. Have a look around GWT and you should see a count of all pages indexed.

LaxLoafer 71 Posting Whiz in Training

Search engines will crawl any public content on your website and possibly index it. A URL doesn't necessarily have to be in your sitemap. As long as the content is linked from another known source, sooner or later it will get discovered.

If you wish to restrict indexing of content, consider using a robots.txt file or the meta robots tag, which all well-behaved search engines will observe. Bad bots will simply ignore your directives and crawl regardless.
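
For example, a couple of lines in robots.txt will ask crawlers to stay out of a folder (the path here is only an illustration):

# robots.txt - ask all crawlers to skip the /private/ folder
User-agent: *
Disallow: /private/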

If you need to prevent content from being indexed, make it private, i.e. require some form of authentication.

Sometimes URLs can get indexed multiple times. This can happen, for example, when you have dynamically generated content. You might have one URL, but perhaps the content is displayed differently depending on the querystring, e.g.

http://example.com/some-list.php?sort=ascending
http://example.com/some-list.php?sort=descending

Only one page, but two URLs!

Search engines tend to regard this as duplicate content. In this situation, best practice is to use 'link rel canonical' to avoid a penalty.
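
In the example above, both sorted versions would declare the same preferred URL by including an identical tag in their <head>:

<!-- Both ?sort=ascending and ?sort=descending point at the same canonical URL -->
<link rel="canonical" href="http://example.com/some-list.php">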

LaxLoafer 71 Posting Whiz in Training

You could probably use cURL or wget to produce a copy of the site.

These utilities are command-line HTTP clients. They see what a web browser normally sees, which lets you capture HTML, JavaScript, stylesheets, and images. Dynamically generated content may also be captured, but the server-side code that generated it will not be. Your ASP.NET code isn't visible to web clients, so it will be excluded.

To produce a copy of the dynamically generated content you'll need a running ASP.NET site for the HTTP client to crawl. If some of that content isn't linked from anywhere, you may also need to point the crawler at a sitemap.
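
A rough wget sketch, assuming the running site is reachable at example.com (a placeholder):

# Crawl the running site, rewrite links for local browsing, and pull in
# the images, stylesheets and scripts each page references.
wget --mirror --convert-links --page-requisites --no-parent http://example.com/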

LaxLoafer 71 Posting Whiz in Training

Rubberman! Are you suggesting people aged over 40 don't know about computers? :-o

LaxLoafer 71 Posting Whiz in Training

Didn't realize photos could be tagged in that way. Thanks for the tip. It seems your solution may not work for all squashed images though: adding tags to the one sample I have has made no difference. Still squashed.

LaxLoafer 71 Posting Whiz in Training

There are tools for that, for example...

However, online sitemap generators are unable to discover content that isn't linked in some way. If you have a few URLs missing from the generated sitemap it's easy enough to add them manually. Otherwise you may want to look for a server-side solution.

LaxLoafer 71 Posting Whiz in Training

...but you can accomplish the same thing (grep + sed G) in one line of awk

True, but piping the output to sed makes it pretty trivial...

grep "some pattern" some-file.txt | sed G
LaxLoafer 71 Posting Whiz in Training

In case you've missed the news, Google finished rolling out Penguin 2.0 yesterday. They're expecting the algorithm change to have a noticeable effect on 2.3% of US English queries. Further info on Google's Webmaster Central Blog: Another step to reward high-quality sites

Do we have any winners and losers here?

LaxLoafer 71 Posting Whiz in Training

I want to do Medical transcription practice at home but it can be worked only in Windows XP Professional.

Could I ask why you're limited to Windows XP Professional?

Is the problem a software compatibility issue, and have you tried running your application in Windows XP compatibility mode?

LaxLoafer 71 Posting Whiz in Training

... install Windows 7 as the [host], then install a virtualization app such as VirtualBox.

Excellent suggestion.

With a virtualization app like VirtualBox there's no need to mess around with creating separate partitions, adding additional hard drives, or configuring boot managers. There's also no need to shut down one OS in order to use the other.

LaxLoafer 71 Posting Whiz in Training

The output of grep and other commands can be saved to a file by using a greater-than '>' symbol to redirect the standard output stream. Try something like...

grep "some pattern" some-file-to-search.txt > the-results.txt

To double-space the resulting file, you might want to try 'sed', the stream editor. See if this works...

sed G your-file.txt
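
Or do both in one go, sending the double-spaced matches straight to a file:

# search, double-space the matches, and save the result
grep "some pattern" some-file-to-search.txt | sed G > the-results.txt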
LaxLoafer 71 Posting Whiz in Training

Someone I don't know was given an encrypted hard disk and a USB stick containing the keys. The hard disk had a glass platter. He was instructed to hide the USB stick somewhere safe, and 'accidentally' drop the hard disk if challenged. The glass platter would break and any data would be unrecoverable.

Unfortunately it wasn't his lucky day, and when customs pulled him over he dropped the USB stick instead. I guess he must have been walking funny.

LaxLoafer 71 Posting Whiz in Training

Gaining a ranking of 1 would not be unreasonable after nine months' work. It's possible your site has been penalized for some reason. As in-bound links don't appear to be the issue, perhaps there's a Panda holding your site back.

Are there any similarities between your site and the type of site that Google's Panda algorithm is designed to catch?

LaxLoafer 71 Posting Whiz in Training

How often does google udate ranking?

Quite frequently, apparently. But the PageRank seen in Google's toolbar is only updated roughly every quarter, which makes it quite a blunt tool.

LaxLoafer 71 Posting Whiz in Training

Is there any site to submit my sitemaps?

You shouldn't really need to. So long as a reference to the sitemap has been included in your robots.txt file, all the main public search engines should pick it up automatically.
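
The reference itself is just a single line in robots.txt, along these lines (substitute your own sitemap URL):

# Point crawlers at the sitemap
Sitemap: http://example.com/sitemap.xml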

LaxLoafer 71 Posting Whiz in Training

How can I get my articles to the top of the crawl?

Tell Google your articles are more important than other content on your site. Give the URLs a relatively high priority in your sitemap.

Don't bury your content. Rewrite the URLs so that your articles appear closer to the root folder, e.g.
http://example.com/read-all-about-it.htm
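
In the sitemap itself, a higher-priority entry might look something like this (the URL is the example above, and the changefreq and priority values are only illustrative hints):

<!-- one entry from sitemap.xml: location, expected change frequency, relative priority -->
<url>
  <loc>http://example.com/read-all-about-it.htm</loc>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>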

Should I put my name on the top or can I just use my site name?

Why not both? Google seems keen on identifying authors, so I think this would be a great idea. Best to avoid claiming authorship of external content though; you don't want to be mistaken for a content scraper.

LaxLoafer 71 Posting Whiz in Training

Art, the sitemap in your root directory contains nearly 400 items. I only checked a few of the URLs, but most of them look like they point to articles found on AP, CNN, etc. You're asking Google to crawl content they've almost certainly seen already. It's unlikely (unless I'm a banana) that you'll ever be able to outrank the original source.

You've also assigned each one a high priority (0.8). The priority tag gives search engines a hint as to what you think is important, and it's relative to the other content on your site. By default, the rest of your content will have a priority of 0.5, so you're currently telling Google that the best content on your site comes from other sources!

I'm not suggesting that fixing this will resolve your ranking issues, but why not use sitemaps to promote YOUR content?

Bear in mind that search engines allocate only a limited amount of time for crawling each site. Sites that are of low quality probably get less time than others. By thinning out a sitemap you can make it easier for them to crawl and thus index your content.

LaxLoafer 71 Posting Whiz in Training

Made it to 101 posts. Yay! Can I get a puppy?

LaxLoafer 71 Posting Whiz in Training

True, the market is competitive. But doesn't it seem odd that Art's site has zero PR, given that Google has crawled so many pages?

LaxLoafer 71 Posting Whiz in Training

If Google detects unnatural links to a site I believe they'll notify you via GWT. Did you receive a notice?

Although 14,000 links sounds like a lot, there can be a perfectly valid explanation. For example, what would happen if your link appeared on a dynamically generated page with a unique URL? Telling Google about any query string parameters used might help, if the linking site is under your control. This type of issue is fairly common, and I expect Google is clever enough to recognize such links. I wouldn't be surprised if they treat a thousand links from a single domain as just one.

It might look more suspicious if you had suddenly gained a few hundred in-bound links from multiple domains. I wouldn't be too concerned about the 14k links.

LaxLoafer 71 Posting Whiz in Training

Hi Arthur

How did you acquire the backlinks? Is it possible you've violated Google's webmaster guidelines in some way?

Note that back links aren't the only ranking factor. It's important to have original content on your site. If Google finds the same content elsewhere, it will attempt to identify the original source. The original will be displayed in preference and any copies will be masked from search results. Does your site contain much duplicate content?

You may also want to check that Google and visitors to your site see the same content. Try the 'Fetch as Googlebot' tool in Google Webmaster Tools.

Viewing a site without javascript or styles enabled can sometimes help to understand what crawlers see. Is the content too sparse? Does it resemble a portal or list of links?

LaxLoafer 71 Posting Whiz in Training

If you want to tell search engines about unlinked content on your site, content that isn't normally discoverable by crawling pages, you may find sitemaps help.

LaxLoafer 71 Posting Whiz in Training

Erum,

When the button is clicked your function will be called and the timer started. Just a few milliseconds after the call to setTimeout() your function will exit and the form will be posted back to the server. When that happens you'll lose any environment or state, so the timeout you set will no longer exist.

If you prevent the form from being posted back, you should find the timeout works fine. On line 18, try something like...

<form id="form1" runat="server" onsubmit="return false">
LaxLoafer 71 Posting Whiz in Training

Manute Bol

Favorite Buzkashi player?

LaxLoafer 71 Posting Whiz in Training

Is it running as a service? You may need to manually stop the service before deleting or uninstalling the software.
To stop a service, log in as an administrator and run services.msc, or use net stop <service name> from the command line.
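
For example, from an elevated command prompt (the service name below is just a placeholder):

rem List running services to find the exact service name, then stop it.
sc query
net stop "SomeServiceName"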

LaxLoafer 71 Posting Whiz in Training

Which edition of Windows Vista are you using? If it's Ultimate or Business you may be able to recover the missing file using the Previous Versions feature, providing you also have System Restore enabled.

See: Recover lost or deleted files on Microsoft's site, specifically the section on "Restoring files from previous versions".

There's also some info pertaining to Vista here: Previous versions of files: frequently asked questions

LaxLoafer 71 Posting Whiz in Training

Is a vBulletin license transferable?

LaxLoafer 71 Posting Whiz in Training

Hi Beep

Have you checked the EXIF data? The x-resolution and y-resolution fields should match the image. If they don't, Windows Photo Viewer uses these values regardless and displays a squashed image.
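
I haven't tried this on your images, but a command-line tool such as ExifTool should be able to show, and if necessary rewrite, those fields. A rough sketch (photo.jpg and the 72 dpi value are just placeholders):

# Print the stored resolution fields for the image.
exiftool -XResolution -YResolution -ResolutionUnit photo.jpg
# If they're wrong, something like this should rewrite them.
exiftool -XResolution=72 -YResolution=72 photo.jpg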

This sort of problem can apparently occur when a photo is resized or rotated, typically with older image editing software. If the software doesn't recognize EXIF, or support asymmetrical resolutions, the EXIF data won't get updated appropriately.

A quick search on the Internet might suggest a tool that will batch fix the images for you, or report the correct EXIF information. Can anyone recommend a utility?

LaxLoafer 71 Posting Whiz in Training

It certainly helps to know which APIs to search for. You can find out what interfaces an application exports with Microsoft's OleView, or some other COM viewer.

Although the utility was removed from VS2010, the source code can be found in the VS samples directory:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\Samples\1033\VC2010Samples.zip

A compiled version can also be found in the Windows SDK, available from MS.

LaxLoafer 71 Posting Whiz in Training

You dont need anything. All you need is a text editor

I agree with Jorge, but I would recommend using an HTML/CSS editor instead. Although HTML and CSS can certainly be written in a plain text editor, text editors don't generally provide useful features like HTML and CSS validation or code completion. A decent HTML editor can help you avoid the kind of mistakes beginners often make, and it'll save you a lot of frustration.

LaxLoafer 71 Posting Whiz in Training

Hi Somjit

You don't need a website in order to learn HTML and CSS. If you save web pages and style sheets to your local file system, you'll find any modern web browser can open them.

However there are some aspects of the web that are easier to learn if you have access to a web server. Without a server you may struggle with things like forms, cookies and AJAX requests.

Once you've got to grips with the basics of HTML and CSS, have a go at installing a web server locally. That way you can create your own websites locally, for free. As many as you like :-)

Two popular web servers are IIS and Apache, both of which are available for free.
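
If installing a full server feels like overkill to begin with, and you happen to have Python installed, its built-in server is a quick throwaway option, run from the folder containing your pages:

# Serve the current folder at http://localhost:8000/ (Python 3)
python -m http.server 8000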