Tuesday, September 27, 2016


HOW TO PROTECT YOURSELF FROM HACKERS?


Cyber attacks are increasing day by day, and nowadays hackers target anybody, from anyplace. Here I show you how you can protect yourself from hackers and cyber criminals.
We have collected the best tips and tricks for you in this post. Read it and follow every step to secure your computer and your online activities. Every computer user needs to know the basics of keeping their device and data secure. Below are a few tips and habits that can help you:

TIP 1: USE STRONG PASSWORDS

Passwords are your first line of defense in this digital war between hackers and potential victims. If I can get your password, the rest is easy. Create strong passwords, for example with a password generator website.
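If you would rather not trust a website with that job, you can generate a strong password locally. Below is a minimal sketch using Python’s standard secrets module; the 16-character length is just an illustrative default.

import secrets
import string

def generate_password(length=16):
    # Draw every character from letters, digits, and punctuation using a
    # cryptographically secure source (never use the random module for this).
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())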

TIP 2: USE TWO-FACTOR AUTHENTICATION

Two-factor authentication offers an extra layer of security that protects you in case your password gets stolen. Turn this feature on everywhere it is available. Further, if you lose your phone (most often used as the authentication device), you can still get back into your account if you plan ahead.
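For the curious, here is roughly how the rotating six-digit codes behind most two-factor apps work. This is a minimal sketch assuming the third-party pyotp library (pip install pyotp); in real life the shared secret is created once when you enroll, not on every run.

import pyotp

secret = pyotp.random_base32()  # shared secret, normally stored at enrollment
totp = pyotp.TOTP(secret)       # six-digit code that rotates every 30 seconds

code = totp.now()                      # what your authenticator app displays
print("current code:", code)
print("verifies:", totp.verify(code))  # the check the server performs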

TIP 3: NEVER CLICK ON SUSPICIOUS LINKS

I hope this is superfluous information, but NEVER click on a link sent to you in an email. I don’t care if it appears to come from a trusted source, such as your bank or a friend: NEVER click on a link in your email. It is child’s play for me to embed malware in that innocuous-looking link.
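The danger is that a link’s visible text and its real destination are two different things. Here is a minimal sketch, using only Python’s standard library, of the same check you perform when hovering over a link; the email HTML and the domains in it are made up for illustration.

from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    # Collect (real target, displayed text) pairs for every <a> tag.
    def __init__(self):
        super().__init__()
        self.links = []
        self.href = None
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.href is not None:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.href is not None:
            self.links.append((self.href, "".join(self.text).strip()))
            self.href = None

email_html = '<a href="http://evil.example/login">https://www.mybank.com</a>'
auditor = LinkAuditor()
auditor.feed(email_html)
for target, shown in auditor.links:
    print("displayed:", shown, "-> actually points to:", target)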

TIP 4: DO NOT USE P2P FILE SHARING NETWORKS

Hopefully, this only applies to a few of you. Do NOT use peer-to-peer file sharing sites. For the uninitiated, peer-to-peer (or P2P) file sharing is the uploading and downloading of music, videos, TV shows, movies, documents, and more from one computer to another without using a centralized server.
Music, movies, documents, and other files are really easy to embed malware in. This means that when you download files from P2P networks, you are giving me easy access to your system. In reality, a great many of these files have malware in them. If you have downloaded even one file from P2P, your machine may well be infected, probably irretrievably.

TIP 5: KEEP YOUR SOFTWARE UP TO DATE

New security vulnerabilities (holes) are discovered daily in your operating system (Windows 7 or 8, Linux, Mac OS X) and your applications (Word, Excel, Flash, IE8, Adobe Reader, etc.). When these vulnerabilities are found, hackers like me develop ways to exploit them.
Soon these “exploits” are passed around to other hackers, and everyone tries to use them against you. This allows us to install our software on your system to control it and steal your resources and information.
When software developers such as Adobe, Microsoft, and Apple learn of these vulnerabilities, they develop “patches” to close the security holes. They release these patches in the updates they offer you, sometimes daily. You must update to be secure!

TIP 6: USE ANTIVIRUS PRODUCTS & KEEP THEM UP TO DATE

Once again, I hope this piece of advice is superfluous. Everyone should have some form of antivirus software on their system. AV software is not perfect, but it is certainly better than nothing.
Even the best AV software is effective against only about 95% of KNOWN malware (AV software is totally ineffective against unknown or zero-day malware). That means one in 20 pieces of malware will be missed. Some lower-quality AV software will miss 1 in 2. In addition, AV software is only effective if it’s activated and updated, so make certain to update its signatures daily.
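To make the “KNOWN malware” point concrete: at its simplest, signature-based detection is a hash lookup. The toy sketch below (Python, standard library only) hashes a file and checks it against a signature set; the single entry shown is the widely published SHA-256 of the harmless EICAR antivirus test file. Real AV engines layer heuristics and behavioral analysis on top of this, which is exactly why unknown malware slips through.

import hashlib

KNOWN_BAD_SHA256 = {
    # Widely published SHA-256 of the harmless EICAR antivirus test file
    "275a021bbfb6489e54d471899f7db9d1663fc695ec2fe2a2c4538aabf651fd0f",
}

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_malware(path):
    # A file is flagged only if its hash is already in the database, which
    # is why zero-day malware is never caught by signatures alone.
    return sha256_of(path) in KNOWN_BAD_SHA256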

TIP 7: DO NOT USE ADOBE FLASH

Adobe’s Flash Player is on nearly every computer, and even on Android devices whose owners install it manually. It is what plays those interesting Russian dashcam videos, as well as YouTube videos, animations, etc. Without it, when you go to a website with video or animations, you get a blank screen and an ominous-looking message telling you to install Flash Player.
Flash Player is among my favorite pieces of code to hack. Nearly everyone has it, and it is SO flawed. I know this is a radical step, but if you really want to make certain that your system is bulletproof, remove Flash Player from your computer, tablet, and smartphone. Even with updates, new vulnerabilities come out daily for this “hacker’s best friend.”

TIP 8: USE A REALLY GOOD FIREWALL

Although Microsoft ships a rudimentary firewall with its operating system, I strongly suggest that you install a third-party firewall for better protection.
There are many third-party software firewalls out there, some better than others, but I want to suggest ZoneAlarm’s Free Firewall. As the name says, it is free, and it is very effective. Not only does it block outsiders from getting in, but it also stops malware already on your computer from accessing resources and talking out (hackers need to control their malware, so the malware must be able to communicate OUT to be effective).

TIP 9: MAKE YOUR PHONE’S LOCK CODE MORE SECURE

Many of us assume that the default 4-digit PIN is secure enough. It is not. Every extra digit multiplies the number of possible codes by ten: a 4-digit PIN has only 10,000 combinations, while a 6-digit PIN has 1,000,000. On both iOS and Android, go to Settings and lengthen your phone’s lock code. Android also has lock screen tools that let you enhance your phone’s security further.

TIP 10: LOOK OUT FOR SOCIAL ENGINEERING ATTACKS

Social engineering is the biggest security concern these days, as cyber thieves and hackers gain access to your secure information by impersonating other companies, phishing, and other common strategies. Be careful with every suspicious phone call, email, link, and other communication you receive. Also, most data breaches are known to originate from internal sources. Awareness is key: it may be astonishing to learn that even security experts can be tricked.

TIP 11: LOCK DOWN YOUR WIRELESS ROUTER

The first line of defense for your home network is your router. To keep your Wi-Fi secure, change the router’s default administrator login, use WPA2 (AES) encryption, and review the other basic settings.

TIP 12: DON’T USE PUBLIC WI-FI WITHOUT A VPN

Public Wi-Fi networks rarely offer any real security. Your best defense on them is a VPN (Virtual Private Network), which encrypts your traffic so that others on the network cannot read it.

TIP 13: BE CAREFUL WHILE USING THUMB DRIVE

Thumb drives, also known as flash drives, are small storage devices that are easy to use across different computers, and they are a popular way to exchange files and documents. Unfortunately, they also spread viruses easily across computers and networks.

TIP 14: TRY NOT TO USE PUBLIC COMPUTERS

For many people, not using a public computer can be difficult. Those without a computer or Internet access at home often use Internet cafes to get online. However, the more people use a computer, the more likely it is that a virus has infected it.

TIP 15: CLEAR YOUR BROWSER HISTORY

This goes for all the devices you use in a day: your home computer, your work computer, or your friend’s iPad. Internet browsers like Firefox or Chrome keep track of where you have been and what you have done online, keeping records of every site you visited. Information about what you sent from or saved on the computer can be kept for days or weeks, and anyone who sees it can assemble a detailed record of your online activities.
Source: thenexthack

HOW TO FIND VULNERABILITIES IN A WEBSITE USING THE ROBOTS.TXT FILE


WHAT IS A ROBOTS.TXT FILE?

An important, but sometimes overlooked element of onsite optimization is the robots.txt file. This file alone, usually weighing not more than a few bytes, can be responsible for making or breaking your site’s relationship with the search engines.
Robots.txt lives in your site’s root directory and exists to regulate the bots that crawl your site. This is where you can grant or deny permission to all, or some specific, search engine robots to access certain pages or your site as a whole. The standard for this file was developed in 1994 and is known as the Robots Exclusion Standard or Robots Exclusion Protocol. Detailed info about the robots.txt protocol can be found at robotstxt.org.

BASIC ROBOTS.TXT EXAMPLES

Here are some common robots.txt setups (they will be explained in detail below).
Allow full access
User-agent: *
Disallow:
Block all access
User-agent: *
Disallow: /
Block one folder
User-agent: *
Disallow: /folder/
Block one file
User-agent: *
Disallow: /file.html

WHY SHOULD YOU LEARN ABOUT ROBOTS.TXT?

  • Improper usage of the robots.txt file can hurt your ranking
  • The robots.txt file controls how search engine spiders see and interact with your webpages
  • This file is mentioned in several of the Google guidelines
  • This file, and the bots it interacts with, are fundamental parts of how search engines work

SEARCH ENGINE SPIDERS

The first thing a search engine spider like Googlebot looks at when it is visiting a page is the robots.txt file.
It does this because it wants to know if it has permission to access that page or file. If the robots.txt file says it can enter, the search engine spider then continues on to the page files.
If you have instructions for a search engine robot, you must give it those instructions, and the way you do so is through the robots.txt file.
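You can reproduce the same permission check a well-behaved crawler performs with Python’s standard urllib.robotparser module; the domain and page below are placeholders.

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the file, exactly as a spider does first

# May Googlebot crawl this page? Prints True if allowed.
print(rp.can_fetch("Googlebot", "https://www.example.com/some-page.html"))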

PRIORITIES FOR YOUR WEBSITE

There are three important things that any webmaster should do when it comes to the robots.txt file.
  • Determine if you have a robots.txt file
  • If you have one, make sure it is not harming your ranking or blocking content you don’t want blocked
  • Determine if you need a robots.txt file

DETERMINING IF YOU HAVE A ROBOTS.TXT

You can check from any browser. The robots.txt file is always located in the same place on any website, so it is easy to determine whether a site has one. Just add “/robots.txt” to the end of the domain name, as shown below.
www.yourdomainname.com/robots.txt
If you have a file there, it is your robots.txt file. You will either find a file with words in it, find an empty file, or find no file at all.
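The same check can be scripted. Here is a minimal sketch with Python’s standard library; it assumes the site answers over HTTPS and treats an HTTP error such as 404 as “no robots.txt”.

import urllib.error
import urllib.request

def fetch_robots_txt(domain):
    # Ask for /robots.txt at the site root, the one place it can live.
    url = "https://" + domain + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError:
        return None  # e.g. 404: the site has no robots.txt

print(fetch_robots_txt("www.example.com"))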

DETERMINE IF YOUR ROBOTS.TXT IS BLOCKING IMPORTANT FILES

Google’s webmaster tools can warn you if you are blocking page resources that Google needs to understand your pages.
If you have access and permission, you can use Google Search Console to test your robots.txt file.
To know for certain that your robots.txt file is not blocking anything you want crawled, you need to understand what it is saying. We cover that below.

DO YOU NEED A ROBOTS.TXT FILE?

You may not even need a robots.txt file on your site. In fact, it is often the case that you do not need one.
Reasons you may want to have a robots.txt file:
  • You have content you want blocked from search engines
  • You are using paid links or advertisements that need special instructions for robots
  • You want to fine tune access to your site from reputable robots
  • You are developing a site that is live, but you do not want search engines to index it yet
  • They help you follow Google’s guidelines in certain situations
  • You need some or all of the above, but do not have full access to your webserver and how it is configured
Each of the above situations can be controlled by other methods; however, the robots.txt file is a good central place to take care of them, and most webmasters have the ability and access required to create and use one.
Reasons you may not want to have a robots.txt file:
  • Not having one is simple and error free
  • You do not have any files you want or need to be blocked from search engines
  • You do not find yourself in any of the situations listed in the above reasons to have a robots.txt file
It is okay to not have a robots.txt file.
When you do not have a robots.txt file the search engine robots like Googlebot will have full access to your site. This is a normal and simple method that is very common.

HOW TO MAKE A ROBOTS.TXT FILE

If you can type or copy and paste, you can also make a robots.txt file.
The file is just a text file, which means that you can use Notepad or any other plain text editor to make one. You can also make one in a code editor, or even copy and paste one.
Instead of thinking “I am making a robots.txt file”, just think “I am writing a note”; they are pretty much the same process.

WHAT SHOULD THE ROBOTS.TXT SAY?

That depends on what you want it to do.
All robots.txt instructions result in one of the following three outcomes:
  • Full allow: All content may be crawled.
  • Full disallow: No content may be crawled.
  • Conditional allow: The directives in the robots.txt determine the ability to crawl certain content.
Let’s explain each one.

FULL ALLOW – ALL CONTENT MAY BE CRAWLED

Most people want robots to visit everything in their website. If this is the case with you, and you want the robot to index all parts of your site, there are three options to let the robots know that they are welcome.
1) Do not have a robots.txt file
If your website does not have a robots.txt file then this is what happens…
A robot like Googlebot comes to visit. It looks for the robots.txt file. It does not find it because it isn’t there. The robot then feels free to visit all your web pages and content because this is what it is programmed to do in this situation.
2) Make an empty file and call it robots.txt
If your website has a robots.txt file that has nothing in it then this is what happens…
A robot like Googlebot comes to visit. It looks for the robots.txt file. It finds the file and reads it. There is nothing to read, so the robot then feels free to visit all your web pages and content because this is what it is programmed to do in this situation.
3) Make a file called robots.txt and write the following two lines in it…
User-agent: *
Disallow:
If your website has a robots.txt with these instructions in it then this is what happens…
A robot like Googlebot comes to visit. It looks for the robots.txt file. It finds the file and reads it. It reads the first line. Then it reads the second line. The robot then feels free to visit all your web pages and content because this is what you told it to do (I explain this below).

FULL DISALLOW – NO CONTENT MAY BE CRAWLED

Warning: This means that Google and other search engines will not index or display your webpages.
To block all reputable search engines spiders from your site you would have these instructions in your robots.txt:
User-agent: *
Disallow: /
It is not recommended to do this as it will result in none of your web pages being indexed.

THE ROBOTS.TXT INSTRUCTIONS AND THEIR MEANINGS

Here is an explanation of what the different words in a robots.txt file mean:

USER-AGENT

User-agent:
The “User-agent” part is there to specify directions to a specific robot if needed. There are two ways to use this in your file.
If you want to tell all robots the same thing, you put a “*” after “User-agent:”. It would look like this…
User-agent: *
The above line is saying “these directions apply to all robots”.
If you want to tell a specific robot something (in this example Googlebot) it would look like this…
User-agent: Googlebot
The above line is saying “these directions apply to just Googlebot”.

DISALLOW:

The “Disallow” part is there to tell robots what folders they should not look at. This means that if, for example, you do not want search engines to index the photos on your site, you can place those photos into one folder and exclude it.
Let’s say that you have put all these photos into a folder called “photos”. Now you want to tell search engines not to index that folder.
Here is what your robots.txt file should look like in that scenario:
User-agent: *
Disallow: /photos
The above two lines of text in your robots.txt file would keep robots from visiting your photos folder. The “User-agent: *” part is saying “this applies to all robots”. The “Disallow: /photos” part is saying “don’t visit or index my photos folder”.

GOOGLEBOT SPECIFIC INSTRUCTIONS

The robot that Google uses to index their search engine is called Googlebot. It understands a few more instructions than other robots.
In addition to “User-agent” and “Disallow”, Googlebot also uses the Allow instruction.

ALLOW

Allow:
The “Allow:” instruction lets you tell a robot that it is okay to see a file in a folder that has been “Disallowed” by other instructions. To illustrate this, let’s take the above example of telling the robot not to visit or index your photos. We put all the photos into one folder called “photos” and made a robots.txt file that looked like this…
User-agent: *
Disallow: /photos
Now let’s say there is a photo called mycar.jpg in that folder that you want Googlebot to index. With the Allow: instruction, we can tell Googlebot to do so. It would look like this…
User-agent: *
Disallow: /photos
Allow: /photos/mycar.jpg
This would tell Googlebot that it can visit “mycar.jpg” in the photos folder, even though the “photos” folder is otherwise excluded.

TESTING YOUR ROBOTS.TXT FILE

To find out whether an individual page is blocked by robots.txt, you can use the robots.txt tester in Google Search Console, which will tell you if files important to Google are being blocked and will also display the content of the robots.txt file.

KEY CONCEPTS

  • If you use a robots.txt file, make sure it is being used properly
  • An incorrect robots.txt file can block Googlebot from indexing your page
  • Ensure you are not blocking pages that Google needs to rank your pages

Now that you have read the whole article, you can see how we find weak points in a website without even touching it: the Disallow lines in robots.txt are a list of the paths the site owner least wants visited (admin panels, login pages, backup folders), so reading the file is often an attacker’s first step of reconnaissance.
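Here is a minimal reconnaissance sketch along those lines, using only Python’s standard library: it fetches a site’s robots.txt and lists every Disallow path. The domain is a placeholder, and you should only probe sites you are authorized to test.

import urllib.request

def disallowed_paths(domain):
    # The Disallow lines are, in effect, the site owner's list of the
    # paths they least want visited -- a ready-made map for inspection.
    url = "https://" + domain + "/robots.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    paths = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

for path in disallowed_paths("www.example.com"):
    print(path)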



WHAT ARE PHISHING SCAMS AND HOW TO PROTECT YOURSELF?



WHAT IS A PHISHING SCAM?

According to Social-Engineer.org, phishing is the “practice of sending emails that appear to be from a reputable source but with the goal of gaining your personal information, banking information, credit card details, and passwords.”

The most common examples of phishing emails are “You won an iPhone, register now to select what color iPhone you want” or “Urgent: You are entitled to a Tax Refund”.

HOW TO PROTECT YOURSELF?

  • Check the URL before clicking a link. If you have even a little doubt about the safety of an email, don’t click on any of its links, even if they look legitimate. It is easy to hover over a link and see where it actually leads (see the sketch after this list). Better still, manually navigate to the website yourself and log in there rather than using the URL you received.
  • Don’t download attachments. The most common way to infect your device with malware is to get you to download an attachment. Most web-based mail clients scan attachments to protect you, but this isn’t foolproof. If you have to download an attachment, scan it with antivirus software before you open it. If a document has a weird double file extension, such as “Document.pdf.exe”, do not open it: it is malware in disguise. To be safe, never open (or even download) files with “.exe” extensions.
  • Check the sender’s address. This can be tricky to do on mobile, and since attackers know this, they increasingly build it into their attacks.
  • Take an active role. Internet links, phone calls, and emails sometimes harbour ill intent. Many phishing schemers attempt to trick you into giving up personal information like bank account and Social Security numbers. It is important to be extra cautious when information is requested through one of these avenues. For example, if you receive a phone call from someone claiming to work for your bank, hang up the phone and call the number on the back of your debit card.
  • Avoid popups. Many phishing scams involve pop-up screens that ask for information like passwords and ZIP codes. To prevent identity theft, avoid entering personal data in those popups.
  • Update your browser. Internet browsers depend on regular updates to guard against the latest known threats. When your browser prompts you to update, don’t put it off; delaying reduces your surfing security.
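Here is the hover check from the first tip expressed as a minimal Python sketch; the domains are hypothetical. Note that a naive substring test would wrongly accept “mybank.com.evil.example”, while a proper hostname suffix check rejects it.

from urllib.parse import urlparse

def belongs_to(url, expected_domain):
    # Compare the link's real hostname against the domain you expect,
    # accepting only that exact domain or its subdomains.
    host = (urlparse(url).hostname or "").lower()
    expected = expected_domain.lower()
    return host == expected or host.endswith("." + expected)

print(belongs_to("https://www.mybank.com/login", "mybank.com"))           # True
print(belongs_to("https://mybank.com.evil.example/login", "mybank.com"))  # False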
Hope this article helps you understand phishing scams, their modes of attack, and the tips to avoid them. Share it and help others protect their personal data.
Source: thenexthack