
Gadget Wisdom

Category: Security & Networking


Lightweight Server Monitoring

Collectd Architecture Schematic

I recently began a move of the Gadget Wisdom and related sites to a new server. The purpose of this was laying the infrastructure for a major upgrade.

One of the major pushes was upgrading monitoring features. Some of the software being used was no longer being maintained, and replacements had to be found.

Nagios and Munin are two of the most popular tools IT specialists use for infrastructure monitoring, but I had good reasons to opt for something more lightweight. There are dozens of monitoring tools, and choosing one can be overwhelming. These are two I have been happy with so far.

One of the first tools I installed was collectd, a daemon that gathers and stores performance data. It is plugin-based, so it can feed its data into a variety of other software. That makes it incredibly extensible, leaving room for future data gathering and future output, and it is also incredibly lightweight, which has its advantages.

To turn the data into graphs, I’m using a simple front-end called Jarmon for now. Jarmon downloads the RRD files generated by collectd and renders them on the client side.
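For illustration, a minimal collectd setup along these lines loads a few input plugins and the rrdtool output plugin whose files Jarmon renders. The plugin choices and paths below are examples, not necessarily what this server uses:

```
# /etc/collectd/collectd.conf (path varies by distro)
LoadPlugin cpu        # per-CPU usage
LoadPlugin memory     # RAM usage
LoadPlugin load       # system load averages
LoadPlugin rrdtool    # write everything out as RRD files

<Plugin rrdtool>
  # Jarmon fetches the .rrd files from here over HTTP,
  # so this directory needs to be web-accessible.
  DataDir "/var/lib/collectd/rrd"
</Plugin>
```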

The second is a monitoring tool called monit. Monit watches various services to ensure they are up, and can take action if they go down: sending an alert, restarting the service, executing a script, and so on. One of the most fun things about having alerts is reading them…and in many cases, knowing I don’t have to do anything, because I told monit to do it for me.
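As a sketch of the kind of rule monit supports, here is a fragment that watches a service and restarts it on failure. The service, paths, and alert address are hypothetical examples, not this server’s actual configuration:

```
# /etc/monit/monitrc fragment (example only)
set alert admin@example.com

check process sshd with pidfile /var/run/sshd.pid
  start program = "/etc/init.d/ssh start"
  stop program  = "/etc/init.d/ssh stop"
  # Restart automatically if the port stops answering...
  if failed port 22 protocol ssh then restart
  # ...but give up (and alert) if it keeps flapping.
  if 5 restarts within 5 cycles then timeout
```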

There will be more to come on this, but what do you use in similar situations?

Published on June 26, 2013

Thinking about RAID vs Backup

Six hard disk drives with their cases opened

The cost of storage hit a low the last time it was time for a storage upgrade. Then prices shot through the roof after a flood in Thailand closed factories.

This shut down all of my hard drive purchases for over two years. When I emerged from my cocoon, Samsung was gone as a hard drive manufacturer…and I had bought many Samsung OEM hard drives.

The purpose of RAID in a redundant system is to protect against hardware failure. There are different RAID levels for this: RAID 1 is a straight mirror of two drives, while RAID 5 and RAID 6 require a minimum of three and four drives respectively.
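To make the drive-count trade-off concrete, here is a small sketch of the usable capacity each level leaves after mirroring or parity. It ignores real-world formatting overhead and is only an illustration:

```python
def usable_capacity_tb(drives: int, size_tb: float, level: int) -> float:
    """Rough usable capacity for common RAID levels, all drives equal size."""
    if level == 1:
        if drives < 2:
            raise ValueError("RAID 1 needs at least 2 drives")
        return size_tb  # mirror: capacity of a single drive
    if level == 5:
        if drives < 3:
            raise ValueError("RAID 5 needs at least 3 drives")
        return (drives - 1) * size_tb  # one drive's worth of parity
    if level == 6:
        if drives < 4:
            raise ValueError("RAID 6 needs at least 4 drives")
        return (drives - 2) * size_tb  # two drives' worth of parity
    raise ValueError("unsupported RAID level")
```

So a pair of 3TB drives in RAID 1 yields 3TB usable; four 3TB drives yield 9TB in RAID 5 or 6TB in RAID 6.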

RAID is important if you care about uptime. If you can afford to be down for a bit, backups are a better choice.

What is being stored, in this case, consists of several categories: Video, Music, Documents, Configuration Files. There is no point in storing complete drive images. The OS can be reinstalled, and it probably will be better off and cleaner running after it is. The OS drive on all of the systems I’ve built or refurbed in the last two years is an SSD, which is a common practice nowadays.

I had been mulling this after reading an article on another hardware refresh by Adam Williamson. He hadn’t refreshed in seven and a half years and used a separate NAS and server. So, why refresh after only two and a half years? Partly it was due to mistakes.

I’d been using WD Green drives, which have several limitations. They park the head after only 8 seconds of inactivity, which inflates the load cycle count. The WD Red drive is designed for 24/7 operation in network attached storage and carries a longer warranty, and I now have two 3TB Reds. The only other alternative in WD’s stable was the Black, their performance drive. It might be time to consider Seagate, the main competitor, as well.

Hard drive warranties continue to shrink: five years, down to three, and now down to two. So there is less protection from the manufacturer and less incentive to build quality products. That is why we had been buying OEM over consumer drives for the last few years.

Back to the subject at hand…why not a RAID? It is simply a matter of cost vs. benefit. This is terabytes of video data, mostly a DVD archive I intend to create by backing up my DVD collection to MKV. If it were lost, the original copies aren’t going anywhere. But, more importantly, cloud backup is impractical.

Using Amazon S3, for example, at a rate of 9.5 cents a GB, a terabyte costs just under $100 a month. Amazon Glacier, their long-term backup option, is 1 cent a GB, or roughly $10 a TB. But once you take video out of the equation, or sharply reduce it, budgeting $5 a month for important data is reasonable, and still gets you a lot of storage options to work with.
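The arithmetic behind those figures is simple. A quick sketch, using 1 TB = 1000 GB as the text does (the rates are the 2013 prices quoted above and will have changed since):

```python
def monthly_cost_usd(tb: float, rate_per_gb: float) -> float:
    """Monthly storage cost for `tb` terabytes at `rate_per_gb` dollars/GB."""
    return round(tb * 1000 * rate_per_gb, 2)

S3_RATE = 0.095       # $/GB/month, as quoted in the post
GLACIER_RATE = 0.01   # $/GB/month

s3_per_tb = monthly_cost_usd(1, S3_RATE)           # just under $100
glacier_per_tb = monthly_cost_usd(1, GLACIER_RATE) # about $10
```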

So, to ensure redundancy, there is a second drive in the system, and backups will be done to it. From there, the backups of everything but the video store will be sent up to the cloud. As I’ve mostly given up buying DVDs (due to Blu-ray), the collection should be fairly static.

Back to Adam Williamson: he had a great idea of having the other computers on the network back up their data to the server, each machine isolated by its own user account on the server. I’m not quite there yet, but it sounds good. I have other plans to download data from my cloud service providers (Google, Dropbox, etc.) and maintain a local backup, but that is a longer-term project. In the interim, I’m reasonably certain Google has a better backup system than I do.

What about off-site then? I still have the old 1TB Green drives. They can be run through diagnostics, loaded up as a backup, and sent off to a relative’s house…I’ve added a hard drive dock connected through an eSATA port to support this.

So in the end, RAID wasn’t necessary for me, but some redundancy was. It may be for you. Comments?

More to come…

Published on April 22, 2013

Feed Changes

The RSS feed icon

To All RSS Subscribers:

Due to the recent uncertainty regarding the future of Feedburner, we are removing all redirects to Feedburner. All links on the site will now use local feeds. If possible, please update your subscriptions.

If not, the Feedburner feeds will continue to be maintained for as long as Google continues to offer the service, but we feel that self-hosting all feeds is the more prudent long-term move.

Feed: http://www.gadgetwisdom.com/feed/

Published on October 7, 2012

Amazon Glacier for the Home User

 

Backup Backup Backup - And Test Restores

Earlier this week, Amazon announced Glacier, long-term storage that costs one cent a gigabyte per month, compared to 12 cents a gigabyte per month for S3. The basic difference is that Glacier can take between 3 and 5 hours to retrieve data, while S3 is instantaneous.

Amazon S3 is a durable, secure, simple, and fast storage service designed to make web-scale computing easier for developers. Use Amazon S3 if you need low latency or frequent access to your data. Use Amazon Glacier if low storage cost is paramount, your data is rarely retrieved, and data retrieval times of several hours are acceptable.

But, let’s go to the pricing. As a home user, we’re assuming you have less than 50TB.

  • Storage
    • Glacier – 0.01 per GB/month
    • S3 – 0.12 per GB/month
  • Data Transfers In – Free on All
  • Data Transfer Out - Glacier and S3 both use the same pricing.
    • 1st GB free
    • Next 10GB, 0.12 a GB
    • Next 40GB, 0.09 a GB
  • Requests
    • Glacier
      • Data retrievals are free; however, Glacier is designed with the expectation that retrievals are infrequent and unusual, and that data will be stored for extended periods of time. You can retrieve up to 5% of your average monthly storage (pro-rated daily) for free each month.
      • If you choose to retrieve more than this amount of data in a month, you are charged a retrieval fee starting at $0.01 per gigabyte. In addition, there is a pro-rated charge of $0.03 per gigabyte for items deleted prior to 90 days.
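The transfer-out tiers above can be turned into a small worked example. This sketch covers only the tiers quoted in the list and ignores anything beyond 51 GB (and retrieval-request fees):

```python
def transfer_out_cost_usd(gb: float) -> float:
    """Data-transfer-out cost under the 2012 tiers quoted above:
    first 1 GB free, next 10 GB at $0.12/GB, next 40 GB at $0.09/GB."""
    tiers = [(1, 0.0), (10, 0.12), (40, 0.09)]  # (tier size in GB, $/GB)
    cost = 0.0
    for size, rate in tiers:
        used = min(gb, size)
        cost += used * rate
        gb -= used
        if gb <= 0:
            break
    return round(cost, 2)
```

Pulling down 11 GB in a month would cost $1.20; 51 GB would cost $4.80, the same whether it comes from S3 or Glacier.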

Amazon has promised an upcoming feature to move data from S3 to Glacier based on data lifecycle policies. The details of how this will work aren’t fully available yet, but we can imagine offloading from S3 to Glacier based on age: keep the last 1-2 months of data on S3, and the older backups on Glacier. It would save a good deal of money on backups.

Not everyone, for that matter, needs high availability…especially if you are keeping something that is infrequently modified. For example, the family photo album. You can keep your local backups, and for 1 cent a month, you get a copy that you can access in an emergency.

Many reports indicated that retrieval is potentially costly, but we found it equivalent to S3, only slower.

But, what would you use this for? We’d like to hear your thoughts.

Published on August 25, 2012

Mandatory PSA: Secure Your Digital Life

The KeePass Password Safe icon.

Every tech pundit out there has been talking about the heartbreaking story of Mat Honan of Wired, how hackers used social engineering to gain access to one of his accounts, and the chain reaction that resulted.

One of Honan’s problems stemmed from how his accounts were daisy-chained together: the recovery email for one account led to another, account names on different networks were consistent, and so on. Figuring out how to mitigate this requires some thought. We have multiple email accounts, and it will probably take some diagramming and planning to sort everything out.

Then there are passwords. We admit to people all the time that we don’t even know half our passwords. We use a two-pronged approach. One prong is the open-source, multi-platform app KeePass, a password vault stored as a file and encrypted with a single master password. All of the passwords in it are generated by the program and impossible for most people to remember.
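A vault’s password generator boils down to drawing characters from a large alphabet with a cryptographically secure random source. Here is a minimal sketch of the idea; it is an illustration, not KeePass’s actual algorithm:

```python
import secrets
import string

# Letters, digits, and punctuation: the kind of alphabet a vault offers.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    """Generate a random password no one needs to memorize."""
    # secrets (not random) so the choices are cryptographically secure.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Because the vault remembers it for you, there is no reason for the result to be pronounceable or short.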

The other prong is LastPass as a service. LastPass has a plugin for every browser and offers one-click login, form filling, and more. The basic service is free, but the premium version adds mobile support and additional features. Even at the $12 a year we pay for premium, we aren’t using half the options it offers.

But, as part of a redundant philosophy, you should keep your most important passwords in multiple locations. Also, having passwords even you don’t know in a vault means you can easily change your credentials regularly for individual sites, should you choose to do so.

Two-factor authentication, although it could be a bit more user-friendly, is enabled for all our Google accounts and LastPass. (This is not a challenge to hackers. There’s nothing very interesting there anyway.)

In security, the mantra is trust no one. Try to walk the line between paranoia and rationality very carefully.

The second issue is backup. This is an area where we could be better. We have a backup plan that needs to be upgraded. We have various cloud backup solutions, and a few local ones. They need to be unified. We’ll get back to this in a future post, once we create a checklist.

But, for those of you out there, let’s cover a few basics. Periodically, extract your online data and store a copy somewhere, both locally and remotely, in addition to your cloud storage. Try a relative’s house. The likelihood of you and your relative both suffering calamities is probably slim. Remember that sending your data to a remote drive and deleting your original copy is an archive, not a backup.

Make a plan and automate as much as possible, because manual action is easy to fall behind on.

So, backup, secure your accounts, do some planning…we’ll be back with more. Consider yourself warned.

Published on August 12, 2012

Thinking about Dual Band Routers

RADIO FREQUENCY ENVIRONMENT AREA (Photo credit: elycefeliz)

Wireless-G has been the established standard for the last few years. We remember when we started playing with Wireless-B, and it was only recently that we jumped to Wireless-N; we hadn’t needed the speed.

With the increasing crowding of wireless spectrum, gigabit wired networks, where possible, are probably a good move.

We jumped this past month to dual-band Wireless-N because of the 5GHz frequency it offers. Wi-Fi usually operates at 2.4GHz, but N supports both frequency ranges.

Very few devices take advantage of the 5GHz band, which means that there will be little interference. Living in a city, there are at least 16 2.4GHz wireless networks in range of our test device.

Dual-band routers offer radios for both frequencies, so devices that do not support 5GHz can still connect.

After much consideration, we overbuilt and purchased the WNDR4500 when it was on sale.


The router offers speed and reliability for the price, as well as multiple simultaneous full-speed connections, guest networking, file sharing, and more. We needed the extra speed after we upgraded to wideband internet; the router had to keep up with the increased throughput.

This isn’t a router review. It is the most expensive router we have ever purchased, but if your home network is important to you, your router should be too. And if you are concerned about interference from other access points, moving to the 5GHz band is a viable option.


The cost of a new Intel wireless mini-PCI card is not prohibitive either. Most of these cards are easily accessible in a laptop, making it a simple upgrade.

But what do you think? Is less interference worth it? Do you care about the possible 450 Mbps throughput? What would be your rationale for going with a high-end router?

Published on April 2, 2012

Urgent: Change your Wireless Security Settings

Linksys WAP54G 802.11g access point (Image via Wikipedia)

Crunchgear reports today that researchers have developed an attack against WPA Encryption when using the TKIP protocol.

If you haven’t already, change your wireless access point’s security settings to the AES protocol, or switch to WPA2, to stay one step ahead. Or, if you are out and about and cannot do so, consider using SSH tunneling or a VPN to encrypt your connection a second time.

Published on August 27, 2009

Wiring Project – Part 1

My first cable modem (Image by lerxst/boycat via Flickr)

Recently, we grew frustrated with the excess of computer wires hooking our systems together and with our unsuccessful attempts to get them into order.

So, with a significant investment of time and the cost of some organizational tools (some new cables, cable tacks, Velcro ties, etc.), we’re going to try to tackle this issue.

Our first plan involves a redo of our networking appliances: the DSL/cable modem, the router, and a gigabit switch. All these items belong together, as they all serve to form the house network, but currently they terminate behind and under a desk in an ugly mess.

Enter our current love affair with Swedish furniture. It isn’t too expensive, and it looks decent for the price we can afford. We recently replaced some old bookshelves with Billy bookcases from Ikea, which are nice for three reasons: a curve at the bottom lets the shelves sit flat against the wall without removing the baseboard; the backing slides into a groove to hold it in place, rather than merely being nailed on; and you can buy height extensions to build the bookcases up to the ceiling.

For the first stage of our project, we decided to build an unobtrusive network wiring rack into an end table. End tables are nice in that they are small. We were concerned about ventilation, so we ended up planning around an Eina table, designed as a bedside table. It is made of particleboard, but it is thick and stable, it offers optional casters for rolling around, and, importantly for us, it is open on both sides.

We haven’t yet finished preplanning for this reconstruction; we attribute our past failures at organization projects like this to a lack of preplanning. The plan is to install the equipment and place the table under our large computer desk, the way many people place rolling filing cabinets, with the wood face outward and the open sides hidden. Only by leaning down will anyone see the components.

All cables will be tied and secured to the cart in such a way that it can easily be disconnected and moved for cleaning and maintenance. The room where these cables live borders another room that needs network access, so we’ve built an in-wall patch panel. It is a simple project: two keystone wallplates with network jacks on them, connected by a short run of cable, so that wires plugged into the matching jacks on each side act as a single coupled-together wire. Since keystone jacks are modular, we can add extra cables, network or otherwise, as needed.

The patch jack, as we should probably call it, sits behind the table under the desk. We plan to run all of the wires going into it from the network table through flexible split-loom tubing, which will further hide them and give the setup a more professional feel. Admittedly, the tubing limits redesign, since removing cables from it is an annoyance…which is why preplanning is so important.

We’ll update you on this with pictures as it develops. Also on the design block, a bedroom HTPC installation plan designed to do some of the same thing.

Published on December 9, 2008

Increasing Wireless Security Now that WPA is Cracked

KeePass Password Safe
Image via Wikipedia

Early on, wireless networks were encrypted using WEP, until it was discovered that even the FBI could crack it in about a minute.

Then came WPA, which was supposed to be much more secure. However, researchers have figured out a way to break a TKIP key in about 12 to 15 minutes. Experts had known that a brute-force dictionary attack could eventually break such a key, but it was not efficient.

Researchers discovered a way to trick a router into sending them larger amounts of data. More data allows them to break the key much more easily using new mathematical techniques. The technique has already been incorporated into popular Wireless sniffer program Aircrack-ng. The newer WPA2 is considered safe from this attack.

That is the simplest way to increase security: if your router supports WPA2, update to it from WPA. If it doesn’t, check for upgraded firmware, or consider switching to a custom firmware if one with WPA2 support is available for your router. We like DD-WRT, which has ports for many routers (see its list of supported devices). When you set WPA2, switch from TKIP encryption to AES only; AES has not yet been cracked.
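On the client side, the equivalent setting looks like this in a wpa_supplicant.conf fragment. The SSID and passphrase are placeholders; router web interfaces expose the same choice as "WPA2" plus "AES":

```
network={
    ssid="HomeNetwork"
    # RSN is the protocol name for WPA2
    proto=RSN
    key_mgmt=WPA-PSK
    # CCMP is the AES-based cipher; note there is no TKIP fallback listed
    pairwise=CCMP
    psk="a-long-random-passphrase"
}
```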

In home mode, WPA and WPA2 use a passphrase to access the network. Recommendations are that this passphrase be at least 13 characters and contain no dictionary words. Too many people, both in securing their networks and in their other passwords, choose weak ones. Remember, you don’t have to memorize the thing; your computer can do that. You can keep it in a secure file, or in a password vault such as KeePass.

Do not set your wireless SSID to anything commonly used. Lists of the top 1000 most common SSIDs are published; top ones to avoid include linksys, default, NETGEAR, Belkin54g, Wireless, hpsetup, WLAN, Actiontec, smc, and Dlink. Many of these are default SSIDs, so they give away unnecessary information about what type of router you have and tell a malicious individual that you may be vulnerable.

There is also MAC filtering, which is touted as a security measure. A MAC address is unique to a specific piece of hardware, but since MAC addresses can be spoofed, filtering is more of a deterrent than anything else.

Other useful features include AP isolation, available on many routers, which blocks connections from a wireless device to other devices on the network. It ensures that a wireless computer can reach only the internet, not the internal network. Of course, if you want to access your internal network wirelessly, this is not as useful.

If you want to go to Enterprise-level WPA2, you can certainly do so. But it is usually overly complex for a simple home installation and requires an external RADIUS server, which you would have to run. If you have an always-on computer around, this might be an option.

As a final measure, you can always just give up. Security guru Bruce Schneier runs an open wireless network; he outlines his reasons and links to much commentary on the subject in a post on his blog. Whether one should run open wireless is a different question from how to secure it. We will say that you can always run a secure network alongside your insecure one, run security independently of your wireless, or, if you are technically proficient, run a gateway portal the way hotels do to secure your connections.

For example, you can use a VPN to connect to your private network and route all communications through it, so traffic is encrypted before it leaves your computer for the network. Many businesses use this technique for individuals accessing their files remotely.

Published on November 12, 2008

Running a Network Server without a Computer

We recently pulled out the Linksys Network Storage Link USB 2.0, aka the NSLU2, affectionately nicknamed the Slug by enthusiasts. The NSLU2 is actually a Linux-based device that runs Samba, an implementation of Windows file sharing, and it has been hacked to run other things.

The NSLU2 is not your only choice for hacking in this manner. You can also use the Synology DS101, the Iomega NAS100D, the D-Link DSM-G600, or any device with an Intel IXP4xx chipset and attached storage. However, the NSLU2 has the largest following, having had an established community for a long time.

There are several options for replacement firmware for the NSLU2. Unslung expands the functionality of the NSLU2 while retaining the original product features and compatibility with the original Linksys firmware. Unlike the stock firmware, Unslung supports NTFS (the format used by Windows drives), card readers, USB hubs for adding extra devices, and other enhancements. Because the Slug has limited memory, additional packages can be installed to a drive hooked into it and run from there; for example, a streaming media server.

Alternatives to Unslung include OpenSlug and Debian for the NSLU2, which remove the Linksys functionality in favor of a complete Linux system, and thus are not for the neophyte.

For more information on the various aspects of the NSLU2, visit its unofficial homepage/wiki. We just set one up as a file server at a remote location. Once we finish setting up the software, it will not only back up files from the main server, but also allow users at the second site to access local copies of their documents.

Published on April 5, 2007
