HTTPS And Why The Internet Still Isn’t Secure

Frank DeLuca is a field technician for Tech Experts.

HTTPS stands for “Hypertext Transfer Protocol Secure” and is the secure version of HTTP, the protocol over which data is sent between your browser and the website you’re connected to.

Most web traffic online is now sent over an HTTPS connection, making it “secure.” In fact, Google now warns that unencrypted HTTP sites are “Not Secure.”

So why is there still so much malware, phishing, and other dangerous activity online?

“Secure” Sites Have a Secure Connection

Previous versions of Chrome displayed the word “Secure” along with a green padlock in the address bar when you visited a website over HTTPS. Modern versions simply show a small gray padlock icon next to the address bar, without the word “Secure.”

That’s partly because HTTPS is now considered the new baseline standard. Everything should be secure by default, so Chrome only warns you that a connection is “Not Secure” when you’re accessing a site over an HTTP connection.

The word “Secure” was removed in part because it could be misleading. It was easy to misconstrue as Chrome vouching for the contents of the site, as if everything on the page were “secure.” But that’s not true at all. A “secure” HTTPS site could be filled with malware or phishing attempts.

HTTPS Does Not Mean A Site is “Secure”

HTTPS is a solid protocol and all websites should use it. However, all it means is that the website operator has obtained a certificate and set up encryption to secure the connection.

For example, a dangerous website full of malicious downloads might be delivered via HTTPS. The website and the files you download are sent over a secure connection, but they might not be secure themselves.

Similarly, a criminal could buy a look-alike domain, get an SSL certificate for it, and imitate Bank of America’s real website. This would be a phishing site with the “secure” padlock, but again, the padlock only refers to the connection itself.

HTTPS Stops Snooping and Tampering

Despite that, HTTPS is great. This encryption prevents people from snooping on your data in transit, and it stops man-in-the-middle attacks that can modify the website as it’s sent to you. For example, no one can snoop on payment details you send to the website.

In short, HTTPS ensures the connection between you and that particular website is secure. No one can eavesdrop or tamper with the data in-between.
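To make the distinction concrete, here is a minimal Python sketch of roughly what the padlock verifies: that a validated TLS connection can be made to a hostname. It says nothing about whether the site’s content is trustworthy; the hostname used in any test is just a placeholder.

```python
import socket
import ssl

def connection_is_encrypted(hostname: str, port: int = 443) -> bool:
    """Return True if a verified TLS connection can be made to hostname.

    This is all the padlock attests to: a valid certificate chain and a
    hostname match. A phishing site can pass this check just as easily
    as a legitimate one.
    """
    context = ssl.create_default_context()  # validates cert chain + hostname
    try:
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                # Handshake succeeded: the *connection* is secure.
                return tls.version() is not None
    except (ssl.SSLError, OSError):
        return False
```

Whether this returns True or False for a given site, it tells you nothing about what the site will send you over that encrypted channel.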

HTTPS Is An Improvement

Websites switching to HTTPS helps solve some problems, but it doesn’t end the scourge of malware, phishing, spam, attacks on vulnerable sites, or various other scams online.

However, the shift toward HTTPS is still great for the Internet. According to Google’s statistics, 80% of web pages loaded in Chrome on Windows are loaded over HTTPS. Plus, Chrome users on Windows spend 88% of their browsing time on HTTPS sites.

This transition does make it harder for criminals to eavesdrop on personal data, especially on public Wi-Fi or other public networks. It also greatly minimizes the odds that you’ll encounter a man-in-the-middle attack on public Wi-Fi or another network.

It’s still no silver bullet. You still need to use basic online safety practices to protect yourself from malware, spot phishing sites, and avoid other online problems.

Back At It Again: Microsoft Suspends Windows Updates

Jason Cooley is Support Services Manager for Tech Experts.

Windows 10 was released in July 2015 and there were plenty of reasons to be excited. If you have been around for the last few versions of Windows dating back to Vista, you may have a love/hate relationship with Microsoft.

Windows Vista, for instance, was once known as the biggest failure Microsoft had experienced. That is, until Windows 8. Just using the adoption numbers, it’s clear that Windows 8 was the least successful OS that Microsoft has ever released.

So, Microsoft and their users had many reasons to be excited about Windows 10. Microsoft assured users that Windows 10 would be a return to the gold standard of operating systems: Windows 7.

As with all releases of a new operating system, there have been some issues. Some of these are symptoms of bigger underlying problems, while others are standalone issues.

With the myriad of problems that have surfaced over the last couple of years, Windows 10 may be the most problematic OS of all time.

Since launch, Windows 10 has had some very unusual problems. While it is almost expected for issues to arise with a new OS, the frequency and type of problems is what’s disturbing. The issues have ranged from broken drivers that leave devices nonfunctional to our latest and greatest issue: the deleted documents folder.

A few times a year, larger updates called “Feature Updates” are released. In April 2018, one such update would incorrectly create a duplicate of your Documents folder. Many of these duplicate folders were empty and served no real purpose.

At this point, Microsoft decided to implement a fix with their next feature update, due in October 2018. The “fix” would remove the duplicate folder.

There was one very large issue with this. The update did not check if the folder was actually empty before deleting it from your system. People all over began reporting the issue where, all of a sudden, their files were gone.

Once reported, Microsoft acted quickly to halt the update before further systems were affected. The update would still download but would not apply. Cutting off access to the update was necessary to spare additional systems from data loss.

A strange side effect of the update being put on hold was the failure to apply the downloaded Windows updates.

This resulted in much longer shutdown and restart times as the update would attempt to apply, then roll back once it failed. This gave users yet another reason to be frustrated.

The issues are now resolved. The fix has been implemented and there is no further risk of data loss.

For what it’s worth, Microsoft also asked for users who lost data to reach out, and they would try to recover it where possible.

It seems like the least they could do, considering the issue was created by poor planning, poor programming, or some combination of the two.

When possible, look into deferring updates. Let problems like these work themselves out before taking them on unnecessarily.

What You Need To Know About Network Security Devices

Scott Blake is a Senior Network Engineer with Tech Experts.

With cyber hacking, identity theft and malware programs on the rise, it’s become even more important to protect your business networks from cyber invaders. One of the best ways to accomplish this is through the use of network security devices and installed anti-virus software.

A security device attached to your network acts as a front-line defense against threats. It behaves as an anti-virus and anti-spyware scanner and as a firewall to block unauthorized network access.

It also acts as an Intrusion Prevention System (or IPS, which will identify rapidly spreading threats like zero day or zero hour attacks) and a Virtual Private Network (VPN), which allows secure access via remote connections.

Security devices come in four basic forms: Active, Passive, Preventative and Unified Threat Management (UTM). Active devices with properly configured firewalls and security rules will be able to block unwanted incoming and outgoing traffic on your network.
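The blocking an Active device performs can be pictured as first-match rule filtering. Here is a toy Python sketch; the rules and the default-deny fallback are illustrative assumptions, not any vendor’s actual configuration, and real firewalls match on far more fields (source address, state, interface, and so on).

```python
# Each rule: (action, protocol, destination_port).
# A packet is checked against the ordered list; first match wins.
RULES = [
    ("allow", "tcp", 443),   # permit inbound HTTPS
    ("allow", "tcp", 25),    # permit inbound mail
    ("deny",  "tcp", 23),    # explicitly block Telnet
]

def filter_packet(protocol: str, dest_port: int) -> str:
    """Return 'allow' or 'deny' for a packet, first matching rule wins."""
    for action, rule_proto, rule_port in RULES:
        if rule_proto == protocol and rule_port == dest_port:
            return action
    # Default deny: anything not explicitly permitted is blocked.
    return "deny"
```

The default-deny posture at the end is the key design choice: properly configured business devices block everything that isn’t explicitly allowed, which is exactly what home-class routers often get wrong.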

Passive devices act as a reporting tool that scans incoming and outgoing network traffic, utilizing IPS security measures. After reviewing these reports, the Active devices can be adjusted to close any detected security holes.

Finding and correcting possible security concerns is accomplished through the use of Preventative devices. These devices scan your network and identify potential security problems.

They will generate a detailed report showing which devices on your network need improved security measures.

UTM devices combine the features of Active, Passive and Preventive devices into one compact device. UTM-enabled devices are the most commonly found security device in small and medium-sized businesses.

By incorporating all the features into one device, your network administrator is able to more easily manage and maintain the security of your network. This greatly reduces overhead to your business.

Many businesses think they know what security measures need to be in place. Often, security professionals will find basic or home-class routers installed in companies.

While the upfront cost of a home-class router is lower than a business-class security device, home-class routers simply don’t offer the features and security a business needs to protect its network.

Companies electing to use home-based devices run a much higher risk of finding themselves the victims of cyber attacks.

Before purchasing any security device, it’s best to consult with a security professional. Have penetration tests performed and a vulnerability assessment report generated.

The report coupled with the advice of the security professional will guide you in determining what device is best for your network and business.

The benefits to having a proper and professionally-installed security device in place include protection against business disruption, meeting mandatory regulatory compliances, and protection of your customers’ data, which reduces the risk of legal action from data theft.

Along with the proper security device in place, you also want to make sure every device on your network is running a robust anti-virus program.

Managed anti-virus platforms are best for any business. Your network administrator can manage, update, scan and remove any threats found on any system attached to the network. This greatly reduces overhead and employee interruption.

For professional advice on security device installation, anti-virus solutions, or if you’re interested in network penetration testing, call Tech Experts at (734) 457-5000.

(Image Source: iCLIPART)

IT Policies Companies Under HIPAA Regulations Must Have

Thomas Fox is president of Tech Experts, southeast Michigan’s leading small business computer support company.

HIPAA (the Health Insurance Portability and Accountability Act) and HITECH (the Health Information Technology for Economic and Clinical Health Act) have been around for quite some time. Even so, many companies covered by these laws are way behind when it comes to implementation. When you really think about it, even companies not covered by these laws should have the requisite policies and procedures in place.

Access Control Policy
How are users granted access to programs, client data, and equipment? This policy also covers how administrators are notified to disable accounts.

Security Awareness Training
Organizations must ensure regular training of employees regarding security updates and what to be aware of. You must also keep an audit trail of reminders and communications in case you’re audited.

Network Security And The “People Problem”

Michael Menor is Vice President of Support Services for Tech Experts.

Security teams that focus on what is already happening and the layers of defense being breached are constantly in reactive mode.

Reviewing reams of data produced by technology – firewalls, network devices or servers – is not making organizations more secure. With this approach, the team fails to prevent breaches or respond in a sufficiently timely way.

Instead, the addition of more data and more complexity perversely prevents achieving the end result: protecting sensitive information.

Today’s significant breaches are executed by people infiltrating the organization, and attackers are doing this by assuming stolen identities or abusing insider privileges.

There is a gap between the initial line of defense (the firewall) and the company’s last line of defense (the alerts received by the security team and their subsequent analysis).

Tracking user activity, especially connections between suspicious behaviors and privileged users, would allow organizations to close this gap.

True understanding of identity has the ability to cut through the overwhelming explosion of data that can render security organizations blind and unable to respond to real threats or even detect if they are under attack.

It is time to incorporate identity into the organization’s breach prevention strategy and overall security. We have to stop accepting a gap approach to security, which is usually focused on data and devices rather than people. In light of the budding perimeterless world, identity will increasingly be the primary factor that matters to the security team.

Identity data is pervasive, yet typically absent from the security world view. For security organizations, our corporate identity (the personal identity elements we bring to our corporate environment) and our behavior are aggregate details essential in building a picture of what is happening within – and beyond – the corporate perimeter.

Together, they offer deep context to inform the security team of the appropriate response to potential threats and real attacks.

The critical piece in this approach is the security organization’s ability and capacity to understand the full scope of identity: who the person really is behind any given device and whether they are behaving abnormally.

This is particularly helpful when identifying attackers that have managed to acquire privileged user credentials.

Identifying Normal Behavior
One way to reduce the scope is to focus on the highest-risk identities first. If you accept that the greatest risk comes from people inside your organization who can access sensitive information – known as “privileged users,” which can also include non-human accounts that may have access – then the correct steps are as follows:

1) Reduce the number of privileged users/identities and accounts.

2) Limit the privileges any one user has to systems and applications necessary to do their job.

3) Integrate the identities of privileged users into security and risk monitoring to spot behavior that may indicate a breach.
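As a highly simplified sketch of step 3, the Python below baselines each privileged user’s normal login hours and flags a login at an hour never seen before for that user. The account name and hours are made up for illustration; real monitoring products model many more signals (location, volume of access, resources touched).

```python
from collections import defaultdict

def build_baselines(history):
    """history: list of (user, hour) login events -> {user: set of hours}."""
    baselines = defaultdict(set)
    for user, hour in history:
        baselines[user].add(hour)
    return baselines

def is_anomalous(baselines, user, hour) -> bool:
    """Flag a login at an hour this user has never logged in at before."""
    return hour not in baselines.get(user, set())

# Illustrative history: a database admin who normally works business hours.
history = [("dbadmin", 9), ("dbadmin", 10), ("dbadmin", 14)]
baselines = build_baselines(history)
```

A 3 a.m. login from “dbadmin” would trip this check; a 9 a.m. login would not. That single bit of identity context is what turns a raw event stream into something a security team can act on.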

Closing the Gap
As more and more of the computing environment breaks outside of the control of central IT organizations, spearheaded by the move towards BYOD (or Bring Your Own Device), the ability to recognize who a user actually is and what is normal for them becomes a foundational part of effective security monitoring.

Without such identity-powered security, security teams will continue to struggle to differentiate whether the events they are monitoring are worth a reaction, and that hesitation allows attackers to execute ever more damaging data breaches.

Furthermore, security teams will continue to operate in reactive mode and fail to prevent breaches or respond in a sufficiently timely way.

If identity is a central component to security management, then security teams will be in a better position to understand the behavior of users and will spend far less time trying to identify the meaning behind the events they are seeing.

People will continue to be our biggest point of exposure and with a keen focus on user behavior and activity, we will be in a much better position to limit the impact of breaches.

(Image Source: iCLIPART)

When Nature Strikes Part 2 – Fire In The Sky

Scott Blake is a Senior Network Engineer with Tech Experts.

Fires in or around server rooms and data centers can ruin your data and put your business at risk. It’s a must to set up fire protocols when you build your room or building.

As I mentioned in Part One of “When Nature Strikes,” the two most important protocols to have in place for any “in case of…” are 1) Have a Plan and 2) Secure Your Data. When dealing with the possibility of fire destroying your server room or data center, you’ll want to make sure you also have Suppression, Containment and Insurance protocols in place as well.

Have a Plan
Disaster recovery plans are now becoming a requirement for many industries. To be prepared, businesses need to locate and define the regulatory requirements of their individual industry, which will also help avoid fines, penalties or negative press associated with noncompliance.

Trying to implement or even design a plan while in the middle of a disaster will only lead to a less than successful recovery. Make sure your team is ready for action and everyone knows what to do. It’s better to be overprepared than have a plan that goes up in flames.

Secure Your Data
Back up your data regularly. Manage a duplicate copy of all data, programming, and company processes at a different physical location or in the cloud. That way, you can continue working at a secondary location if your system crashes. One way to do that is to keep copies of all your data, programs, bare metal backups and virtual machines in data centers in other states.

If you maintain data backups and business software on location, make sure you store them in a fire-rated safe. Fire-rated safes range from around $100 to thousands of dollars for a fully-loaded model.

Fire suppression systems are essential to any server room or data center. A fire suppression system will automatically extinguish a fire without the need for human intervention.

Design standards for fire suppression systems in server rooms and data centers follow strict guidelines, as the fire suppression agents used can be dangerous if the system is not designed correctly. Fires within these environments are suppressed in two different ways.

Reduce Oxygen – This method uses argon, nitrogen, and sometimes carbon dioxide to displace the oxygen in the room. The objective is to reduce the oxygen level in the room to below 15%, which suppresses the fire.

Chemical and Synthetic – Most chemical and synthetic fire suppression agents have some form of cooling mechanism. These systems use less gas and maintain a higher level of oxygen. However, high doses of any synthetic or chemical agent can be toxic, so a correct design is absolutely necessary. Synthetic fire suppression systems will deliver their payload within ten seconds.

A fire doesn’t have to be inside your data center to jeopardize IT equipment. Because radiant heat and smoke from fire in an adjacent room can be enough to damage sensitive network hardware, creating a protective barrier between your server room and the potential fire not only blocks indirect damage, but prevents flame spread as well.

Lightweight, flame-resistant ceramic panels can be used to build fire-safe archive rooms and data centers within larger, standard-construction buildings.

Recovering from fire damage is expensive. Business insurance is crucial and it’s not only for physical property. The right kind of insurance will replace lost income as well. Make sure your business insurance policy is up to date and has the correct coverage to support your business in crisis mode.

Make sure you have all of your suppression and containment systems built and installed by certified professionals. Insurance companies will require this in order for you to acquire the policy and even collect on it.

No one wants to get burned after a fire. Again, make sure your company insurance is up to date and has the appropriate coverage needed to rebuild your business.

If you have questions or you’re looking for suggestions on prepping your business for recovery, not disaster, call Tech Experts at (734) 457-5000.

(Image Source: iCLIPART)

Does VOIP Phone Service Make Sense For Your Business?

Thomas Fox is president of Tech Experts, southeast Michigan’s leading small business computer support company.

When we moved our office last month, part of the process included reviewing things like our telephone and Internet services.

Voice-over-Internet Protocol (VoIP) telephone service is basically technology that allows you to make and receive calls over data networks.

Instead of traditional phone services which channel analog signals such as the sound of your voice over copper wires, VoIP converts these sounds to digital form first—so that they can be sliced, diced, packaged, and routed over a digital network.
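As a rough illustration of that slicing and packaging, the Python sketch below splits sampled audio into fixed-duration chunks and tags each with a sequence number so the far end can reorder and reassemble them. This is a simplification: real VoIP uses RTP headers, codecs, and jitter buffers, none of which appear here.

```python
SAMPLE_RATE = 8000   # 8 kHz sampling, typical for telephone-quality audio
FRAME_MS = 20        # 20 ms of audio carried per packet

def packetize(samples):
    """Split a list of audio samples into (sequence_number, chunk) packets."""
    frame_size = SAMPLE_RATE * FRAME_MS // 1000   # 160 samples per packet
    return [
        (seq, samples[i:i + frame_size])
        for seq, i in enumerate(range(0, len(samples), frame_size))
    ]

def reassemble(packets):
    """Sort packets by sequence number and concatenate, as a receiver would."""
    return [sample for _, chunk in sorted(packets) for sample in chunk]
```

The sequence numbers are what let voice survive a packet-switched network: individual packets may arrive out of order over different routes, and the receiver puts the stream back together.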

Because VoIP technology uses the same ideas behind data networking, and allows the use of the same networks used by computers, voice traffic can also be routed through the Internet as well.

Suddenly, you can dramatically reduce the cost of voice communications, as well as creatively combine both services to create new applications.

VoIP services have really evolved over the past few years. In the past, I’ve been hesitant to use them because the service could be flaky – and since 90%+ of our business comes in by telephone, I wasn’t comfortable with something that wasn’t reliable.

Fortunately for small business owners, that’s changed. The service is now as reliable as service from the phone company. And with the ubiquity of high-speed Internet service, call quality has improved to the point of being indistinguishable from the old telephone network.

Our switch to VoIP provided two significant improvements over the service we used before.

First, we increased our telephone line capacity and coverage. We’ve added local telephone numbers so our clients in Toledo, Dundee, and the downriver area can call us locally.

Second – and perhaps more importantly – we’ve cut our telephone costs in half.

Cost and coverage were my primary concerns when looking at a move to VoIP services. Here are a few reasons you may want to consider switching to VoIP for your office:

You can make and receive calls from multiple devices – for instance, on a dedicated phone, your PC via a software-based phone, or even a mobile phone with VoIP capabilities.

It’s easier to add extensions to your phone system. You can provide a local number or extension for all your staff without additional costs or cabling.

VoIP allows your employees to be more productive and efficient by giving them the ability to receive and make calls anywhere with a data connection.

You can use VoIP as a tool for real-time collaboration along with video conferencing, screen sharing, and digital white boarding.

You can potentially unify your communication channels, streamlining communications and information management—for instance, marrying email with fax and voice in one inbox.

You can employ presence technologies that come standard with VoIP phones and VoIP communication systems. This technology can tell colleagues about your presence or give you info on the status and whereabouts of your staff.

This Website Can “Name That Tune”

Do you ever find yourself humming a song whose title, to your frustration, you don’t know or can’t remember? New search website Midomi is designed to identify that song for you in as little as 10 seconds.

Midomi allows people to search for a song by singing, humming or whistling a bit of the tune. The site then offers search results that include commercially recorded tracks or versions of the song recorded by others who have used the site. The technology also lets people listen to the exact section of each of the results that matched their voice sample.

People also can type in a song title or artist to get results. The system recognizes misspelled words.

Melodis, the company behind the site, has licensed 2 million digital tracks that can be purchased and has accumulated about 12,000 more from users. Users, who range from aspiring American Idol contestants to professionals, can create profiles and rate one another’s performances on the ad-supported site.

The underlying speech- and sound-recognition technology, dubbed Multimodal Adaptive Recognition System, or MARS, differs from similar technologies in that it looks at a variety of factors for recognizing samples, including pitch, tempo variation, speech content and location of pauses, said Chief Executive Keyvan Mohajer, who has a Ph.D. in sound- and speech-recognition from Stanford University.
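As a toy illustration of pitch-based matching of the kind described (not MARS itself, whose details aren’t public here), the Python sketch below reduces a melody to its up/down/repeat contour, sometimes called the “Parsons code.” Because only the direction of each pitch change is kept, two renditions hummed in different keys can still match.

```python
def contour(pitches):
    """Reduce a pitch sequence to a string of U/D/R moves.

    U = pitch went up, D = pitch went down, R = pitch repeated.
    Absolute pitch is discarded, so transposed melodies match.
    """
    moves = []
    for prev, cur in zip(pitches, pitches[1:]):
        moves.append("U" if cur > prev else "D" if cur < prev else "R")
    return "".join(moves)

def same_tune(hummed, recorded) -> bool:
    """Two performances match if their pitch contours agree exactly."""
    return contour(hummed) == contour(recorded)
```

A real system like MARS also weighs tempo variation, pause locations, and speech content, and tolerates wrong notes rather than requiring an exact contour match, but the contour idea is the core of why an off-key hum can still find the right song.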