Software Usage License Agreement
Copyright (C) [year] [copyright holders]
This software, its code, or any fan fiction resulting from its enormous popularity, is provided "AS IS". That means that the software included with this license is provided without warranty, regardless of any previous verbal contract you may have swindled out of the author with your fast talking on the phone...
To the maximum extent permitted by law, the author of this software disclaims all liability for any damages, lost profits, lost socks, lost puppies, hair loss, or architectural fanaticism and cults that may occur as a result of using this software. Under no circumstances shall the author of this software, nor his pet cat 'Mr.Whiskers', be held legally, financially or morally liable for any claim of damages or liability resulting from the use of, abstinence from, ritualistic worshiping of, or illegal pirating of this software. This includes, but is not limited to, lost profits, stolen data, bankruptcy, autonomous homicidal cyborgs or man-eating miniature pony attacks.
Use at your own risk! This software comes with no guarantee of fitness for any purpose, so do not use it for the back end of your Fortune 500 business without at least hiring me first.
Saturday, July 27, 2013
Information entropy and data compression
In my last post, I talked about Shannon data entropy and showed a class to calculate it. Let's take it one step further and actually compress some data based on the entropy we calculated.
To do this, we first calculate how many bits are needed to represent each byte of our data. Theoretically, this is the data entropy, rounded up to the next whole number (Math.Ceiling). But that is not always enough: the number of unique symbols in our data may be too large to be represented in that many bits. We calculate the number of bits needed to represent the count of unique symbols by taking its base-2 logarithm. This returns a decimal (double), so we use Math.Ceiling to round it up to the nearest whole number as well. We set entropy_Ceiling to whichever number is larger. If entropy_Ceiling is 8, we should return immediately, as we cannot compress the data any further.
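As a concrete sketch of that calculation (assuming uniqueSymbols and entropy have already been computed by the DataEntropyUTF8 class from the previous post):
// Bits needed to enumerate every unique symbol, e.g. 70 unique symbols -> 7 bits
int bitsForSymbolCount = (int)Math.Ceiling(Math.Log(uniqueSymbols, 2));
// Entropy rounded up to a whole number of bits
int bitsForEntropy = (int)Math.Ceiling(entropy);
int entropy_Ceiling = Math.Max(bitsForSymbolCount, bitsForEntropy);
// If entropy_Ceiling is 8, every symbol already needs a full byte, so there is nothing to gain.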
We start by making a compression dictionary and a decompression dictionary. We build these by taking the sorted distribution dictionary (DataEntropyUTF8.GetSortedDistribution) and assigning an X-bit value to each entry in it, with X being entropy_Ceiling. The compression dictionary has a byte as the key and an array of bool (bool[]) as the value, while the decompression dictionary has an array of bool as the key and a byte as the value. You'll notice that in the decompression dictionary we store the array of bool as a string, because using an actual array as a key will not work: the dictionary's default EqualityComparer will not produce the same hash code for two different arrays holding the same values.
Then compression is as easy as reading each byte, looking up that byte's bit pattern in the compression dictionary, and appending it to a List<bool>, which is finally packed back into an array of bytes.
Decompression consists of converting the compressed array of bytes into an array of bool, then reading in X bools at a time and getting the byte value from the decompression library, again with X being entropy_Ceiling.
But first, to make this process easier and our code more manageable and readable, I define several extension methods to help us out, since .NET provides almost no support for working with data at the bit level besides the BitArray class. Here are the extension methods that make working with bits easier:
using System;
using System.Collections;
using System.Collections.Generic;

public static class BitExtensionMethods
{
//
// List<bool> extension methods
//
public static List<bool> ToBitList(this byte source)
{
List<bool> temp = ( new BitArray(source.ToArray()) ).ToList();
temp.Reverse();
return temp;
}
public static List<bool> ToBitList(this byte source,int startIndex)
{
if(startIndex<0 || startIndex>7) {
return new List<bool>();
}
return source.ToBitList().GetRange(startIndex,(8-startIndex));
}
//
// bool[] extension methods
//
public static string GetString(this bool[] source)
{
string result = string.Empty;
foreach(bool b in source)
{
if(b) {
result += "1";
} else {
result += "0";
}
}
return result;
}
public static bool[] ToBitArray(this byte source,int MaxLength)
{
List<bool> temp = source.ToBitList(8-MaxLength);
return temp.ToArray();
}
public static bool[] ToBitArray(this byte source)
{
return source.ToBitList().ToArray();
}
//
// BYTE extension methods
//
public static byte[] ToArray(this byte source)
{
List<byte> result = new List<byte>();
result.Add(source);
return result.ToArray();
}
//
// BITARRAY extension methods
//
public static List<bool> ToList(this BitArray source)
{
List<bool> result = new List<bool>();
foreach(bool bit in source)
{
result.Add(bit);
}
return result;
}
public static bool[] ToArray(this BitArray source)
{
return ToList(source).ToArray();
}
}
Remember, these extension methods need to live in a top-level static class directly inside a namespace, not in a nested class. Now we are free to write our compression/decompression class:
public class BitCompression
{
// Data to encode
byte[] data;
// Compressed data
byte[] encodeData;
// # of bits needed to represent data
int encodeLength_Bits;
// Original size before padding. Decompressed data will be truncated to this length.
int decodeLength_Bits;
// Bits needed to represent each byte (entropy rounded up to the nearest whole number)
int entropy_Ceiling;
// Data entropy class
DataEntropyUTF8 fileEntropy;
// Stores the compressed symbol table
Dictionary<byte,bool[]> compressionLibrary;
Dictionary<string,byte> decompressionLibrary;
void GenerateLibrary()
{
byte[] distTable = new byte[fileEntropy.Distribution.Keys.Count];
fileEntropy.Distribution.Keys.CopyTo(distTable,0);
byte bitSymbol = 0x0;
bool[] bitBuffer = new bool[entropy_Ceiling];
foreach(byte symbol in distTable)
{
bitBuffer = bitSymbol.ToBitArray(entropy_Ceiling);
compressionLibrary.Add(symbol,bitBuffer);
decompressionLibrary.Add(bitBuffer.GetString(),symbol);
bitSymbol++;
}
}
public byte[] Compress()
{
// Error checking
if(entropy_Ceiling>7 || entropy_Ceiling<1) {
return data;
}
// Compress data using compressionLibrary
List<bool> compressedBits = new List<bool>();
foreach(byte bite in data) { // Take each byte, find the matching bit array in the dictionary
compressedBits.AddRange(compressionLibrary[bite]);
}
decodeLength_Bits = compressedBits.Count;
// Pad to fill last byte
while(compressedBits.Count % 8 != 0) {
compressedBits.Add(false); // Pad to the nearest byte
}
encodeLength_Bits = compressedBits.Count;
// Convert from array of bits to array of bytes
List<byte> result = new List<byte>();
int count = 0;
int shift = 0;
int offset= 0;
int stop = 0;
byte current = 0;
do
{
stop = encodeLength_Bits - count;
stop = 8 - stop;
if(stop<0) {
stop = 0;
}
if(stop<8)
{
shift = 7;
offset = count;
current = 0;
while(shift>=stop)
{
current |= (byte)(Convert.ToByte(compressedBits[offset]) << shift);
shift--;
offset++;
}
result.Add(current);
count += 8;
}
} while(count < encodeLength_Bits);
encodeData = result.ToArray();
return encodeData;
}
public byte[] Decompress(byte[] compressedData)
{
// Error check
if(compressedData.Length<1) {
return null;
}
// Convert to bit array for decompressing
List<bool> bitArray = new List<bool>();
foreach(byte bite in compressedData) {
bitArray.AddRange(bite.ToBitList());
}
// Truncate to original size, removes padding for byte array
int diff = bitArray.Count-decodeLength_Bits;
if(diff>0) {
bitArray.RemoveRange(decodeLength_Bits,diff); // Remove only the trailing padding bits
}
// Decompress
List<byte> result = new List<byte>();
int count = 0;
do
{
bool[] word = bitArray.GetRange(count,entropy_Ceiling).ToArray();
result.Add(decompressionLibrary[word.GetString()]);
count+=entropy_Ceiling;
} while(count < bitArray.Count);
return result.ToArray();
}
public BitCompression(string filename)
{
compressionLibrary = new Dictionary<byte, bool[]>();
decompressionLibrary = new Dictionary<string, byte>();
if(!File.Exists(filename)) {
return;
}
data = File.ReadAllBytes(filename);
fileEntropy = new DataEntropyUTF8();
fileEntropy.ExamineChunk(data);
int unique = (int)Math.Ceiling(Math.Log((double)fileEntropy.UniqueSymbols,2f));
int entropy = (int)Math.Ceiling(fileEntropy.Entropy);
entropy_Ceiling = Math.Max(unique,entropy);
encodeLength_Bits = data.Length * entropy_Ceiling;
GenerateLibrary();
}
}
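Usage ends up looking something like this (just a sketch; the file path is a placeholder, and Decompress is called on the same instance so that decodeLength_Bits and the dictionaries built in the constructor are still available):
BitCompression compressor = new BitCompression(@"C:\temp\sample.txt");
byte[] compressed = compressor.Compress();
byte[] restored = compressor.Decompress(compressed);
Console.WriteLine("Compressed to {0} bytes, restored {1} bytes.",
    compressed.Length, restored.Length);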
Please feel free to comment with ideas, suggestions or corrections.
Labels:
.net,
C#,
Compression,
Cryptography,
csharp,
Dictionary,
Encryption,
Entropy,
Extension Methods,
Information,
Math,
security
Monday, July 22, 2013
Information Shannon Entropy
Shannon/data entropy is a measurement of uncertainty. Entropy can be used as a measure of randomness. Data entropy is typically expressed as the number of bits needed to encode or represent data. In the example below, we are working with bytes, so the max entropy for a stream of bytes is 8.
A file with high entropy means that each symbol is more or less equally likely to appear next. If a file or stream has high entropy, it is probably compressed, encrypted or random. This can be used to detect packed executables, cipher streams on a network, or a breakdown of encryption on a channel that is expected to always be encrypted.
A text file will have low entropy. If a file has low data entropy, it means that the file will compress well.
This post and code were inspired by Mike Schiffman's excellent explanation of data entropy on his Cisco Security Blog.
Here is what I wrote:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace DataEntropy
{
public class DataEntropyUTF8
{
// Stores the number of times each symbol appears
SortedList<byte,int> distributionDict;
// Stores the entropy for each character
SortedList<byte,double> probabilityDict;
// Stores the last calculated entropy
double overalEntropy;
// Used for preventing unnecessary processing
bool isDirty;
// Bytes of data processed
int dataSize;
public int DataSampleSize
{
get { return dataSize; }
private set { dataSize = value; }
}
public int UniqueSymbols
{
get { return distributionDict.Count; }
}
public double Entropy
{
get { return GetEntropy(); }
}
public Dictionary<byte,int> Distribution
{
get { return GetSortedDistribution(); }
}
public Dictionary<byte,double> Probability
{
get { return GetSortedProbability(); }
}
public byte GetGreatestDistribution()
{
// Return the symbol with the highest frequency count
return GetSortedDistribution().Keys.First();
}
public byte GetGreatestProbability()
{
// Return the symbol with the highest probability
return GetSortedProbability().Keys.First();
}
public double GetSymbolDistribution(byte symbol)
{
return distributionDict[symbol];
}
public double GetSymbolEntropy(byte symbol)
{
return probabilityDict[symbol];
}
Dictionary<byte,int> GetSortedDistribution()
{
List<Tuple<int,byte>> entryList = new List<Tuple<int, byte>>();
foreach(KeyValuePair<byte,int> entry in distributionDict)
{
entryList.Add(new Tuple<int,byte>(entry.Value,entry.Key));
}
entryList.Sort();
entryList.Reverse();
Dictionary<byte,int> result = new Dictionary<byte, int>();
foreach(Tuple<int,byte> entry in entryList)
{
result.Add(entry.Item2,entry.Item1);
}
return result;
}
Dictionary<byte,double> GetSortedProbability()
{
List<Tuple<double,byte>> entryList = new List<Tuple<double,byte>>();
foreach(KeyValuePair<byte,double> entry in probabilityDict)
{
entryList.Add(new Tuple<double,byte>(entry.Value,entry.Key));
}
entryList.Sort();
entryList.Reverse();
Dictionary<byte,double> result = new Dictionary<byte,double>();
foreach(Tuple<double,byte> entry in entryList)
{
result.Add(entry.Item2,entry.Item1);
}
return result;
}
double GetEntropy()
{
// If nothing has changed, dont recalculate
if(!isDirty) {
return overalEntropy;
}
// Reset values
overalEntropy = 0;
probabilityDict = new SortedList<byte,double>();
foreach(KeyValuePair<byte,int> entry in distributionDict)
{
// Probability = Freq of symbol / # symbols examined thus far
probabilityDict.Add(
entry.Key,
(double)distributionDict[entry.Key] / (double)dataSize
);
}
foreach(KeyValuePair<byte,double> entry in probabilityDict)
{
// Entropy = probability * Log2(1/probability)
overalEntropy += entry.Value * Math.Log((1/entry.Value),2);
}
isDirty = false;
return overalEntropy;
}
public void ExamineChunk(byte[] chunk)
{
if(chunk==null || chunk.Length<1) {
return;
}
isDirty = true;
dataSize += chunk.Length;
foreach(byte bite in chunk)
{
if(!distributionDict.ContainsKey(bite))
{
distributionDict.Add(bite,1);
continue;
}
distributionDict[bite]++;
}
}
public void ExamineChunk(string chunk)
{
ExamineChunk(StringToByteArray(chunk));
}
byte[] StringToByteArray(string inputString)
{
// Cast<byte>() on a char[] throws an InvalidCastException at runtime;
// encode the string as UTF-8 instead, matching the class name
return System.Text.Encoding.UTF8.GetBytes(inputString);
}
void Clear()
{
isDirty = true;
overalEntropy = 0;
dataSize = 0;
distributionDict = new SortedList<byte, int>();
probabilityDict = new SortedList<byte, double>();
}
public DataEntropyUTF8(string fileName)
{
this.Clear();
if(File.Exists(fileName))
{
ExamineChunk( File.ReadAllBytes(fileName) );
GetEntropy();
GetSortedDistribution();
}
}
public DataEntropyUTF8()
{
this.Clear();
}
}
}
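For example, measuring the entropy of a file might look like this (a quick sketch; the path is a placeholder):
DataEntropyUTF8 entropy = new DataEntropyUTF8(@"C:\temp\sample.bin");
Console.WriteLine("Unique symbols: {0}", entropy.UniqueSymbols);
Console.WriteLine("Bytes examined: {0}", entropy.DataSampleSize);
Console.WriteLine("Entropy: {0:F4} bits per byte", entropy.Entropy);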
Labels:
.net,
C#,
Compression,
Cryptography,
csharp,
Data,
Encryption,
Entropy,
Information,
Math,
security
Sunday, July 21, 2013
C# developer humor - Chuck Norris.
These jokes were inspired by this link. I have modified them a bit to apply to the C or C# language.
Here are the adapted ones:
- Chuck Norris can make a class that is both abstract and constant.
- Chuck Norris serializes objects straight into human skulls.
- Chuck Norris doesn’t deploy web applications, he roundhouse kicks them into the server.
- Chuck Norris always uses his own design patterns, and his favorite is the Roundhouse Kick.
- Chuck Norris always programs using unsafe code.
- Chuck Norris only enumerates roundhouse kicks to the face.
- Chuck Norris demonstrated the meaning of float.PositiveInfinity by counting to it, twice.
- A lock statement doesn’t protect against Chuck Norris, if he wants the object, he takes it.
- Chuck Norris doesn’t use VisualStudio, he codes .NET by using a hex editor on the MSIL.
- When someone attempts to use one of Chuck Norris’ deprecated methods, they automatically get a roundhouse kick to the face at compile time.
- Chuck Norris never has a bug in his code, without exception!
- Chuck Norris doesn’t write code. He stares at a computer screen until he gets the program he wants.
- Code runs faster when Chuck Norris watches it.
- Chuck Norris methods don't catch exceptions because no one has the guts to throw any at them.
- Chuck Norris will cast a value to any type, just by staring at it.
- If you catch { } a ChuckNorrisException, you’ll probably die.
- Chuck Norris’s code can roundhouse kick all other classes' privates.
- C#'s visibility levels are public, private, protected, and “protected by Chuck Norris”. Don’t try to access a field with this last modifier!
- Chuck Norris can divide by 0!
- The garbage collector only runs on Chuck Norris code to collect the bodies.
- Chuck Norris can execute 64-bit instructions on a 32-bit CPU.
- To Chuck Norris, all other classes are IDisposable.
- Chuck Norris can do multiple inheritance in C#.
- MSBuild never throws exceptions at Chuck Norris, not anymore. 753 dead Microsoft engineers is enough.
- Chuck Norris doesn’t need unit tests, because his code always works. ALWAYS.
- Chuck Norris has been coding in generics since 1.1.
- Chuck Norris’ classes can’t be decompiled... don’t bother trying.
Here are some originals:
- If you try to derive from a Chuck Norris Interface, you'll only get an IRoundhouseKick in-the-face.
- Chuck Norris can serialize a dictionary to XML without implementing IXMLSerializable.
- Chuck Norris can decompile your assembly by only reading the MSIL.
Tuesday, July 16, 2013
Convert a Class or List of Class to a DataTable, using reflection.
Note by author:
Since writing this, I have expanded on this idea quite a bit. I have written a lightweight ORM class library that I call EntityJustWorks.
The full project can be found on GitHub or CodePlex.
EntityJustWorks not only goes from a class to DataTable (below), but also provides:
- SQL 'SELECT' statement to a List<T> of populated classes, each one resembling a row
Security Warning:
This library generates dynamic SQL, and has functions that generate SQL and then immediately execute it. While it is true that all strings funnel through the function Helper.EscapeSingleQuotes, this can be defeated in various ways, and only parameterized SQL should be considered safe. If you have no need for them, I recommend stripping semicolons ; and dashes --. There are also some Unicode characters that can be interpreted as a single quote, or may be converted to one when changing encodings. Additionally, there are Unicode characters that can crash .NET code, mainly in controls (think TextBox). You should almost certainly impose a whitelist:
string clean = new string(dirty.Where(c => "abcdefghijklmnopqrstuvwxyz0123456789.,\"_ !@".Contains(c)).ToArray());
PLEASE USE the SQLScript.StoredProcedure and DatabaseQuery.StoredProcedure classes to generate SQL for you, as the scripts they produce are parameterized. All of the functions can be altered to generate parameterized scripts instead of sanitized ones. Ever since people started using this, I have been maintaining backwards compatibility. However, I may break this in the future, as I do not wish to teach dangerous/bad habits to someone who is learning. This project is a few years old, and it is already showing its age. What is probably needed here is a total rewrite, deprecating this version while keeping it available for legacy users after slapping big warnings all over the place. This project was designed to generate the SQL scripts for standing up a database for a project, using only MY input as data. It was never designed to process a USER'S input! Even if the data isn't coming from an adversary, client/user/manually entered data is notoriously inconsistent. Please do not use this code on any input that did not come from you without first implementing parameterization. Again, please see the SQLScript.StoredProcedure class for inspiration on how to do that.
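To be concrete about what 'parameterized' means here, this is the general ADO.NET pattern (a sketch only, using System.Data.SqlClient; the connection string, table and column names are made up for illustration and are not part of EntityJustWorks):
using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand(
    "SELECT * FROM Orders WHERE LastName = @lastName", connection))
{
    // The value travels as a typed parameter and is never spliced into the SQL text,
    // so input like "'; DROP TABLE Orders;--" is treated as plain data.
    command.Parameters.AddWithValue("@lastName", userInput);
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine(reader["LastName"]);
        }
    }
}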
This class uses generics to accept a class type, and uses reflection to determine the name and type of the class's public properties. With that, a new DataTable is made and the DataColumnCollection is fleshed out. Then you can add rows to the DataTable by passing in instances of the class with its property fields containing values.
Finally, we serialize the DataTable to an XML file, save its schema, then load it all back in again as a proof of concept.
Usage example:
List<Order> orders = new List<Order>();
// Fill in orders here ...
// orders.Add(new Order());
// Convert class to DataTable
DataTable ordersTable = ClassListToDataTable(orders);
// Set DataGrid's DataSource to DataTable
dataGrid1.DataSource = ordersTable;
Here is the Code:
public static DataTable ClassToDataTable<T>() where T : class
{
Type classType = typeof(T);
List<PropertyInfo> propertyList = classType.GetProperties().ToList();
if (propertyList.Count < 1)
{
return new DataTable();
}
string className = classType.UnderlyingSystemType.Name;
DataTable result = new DataTable(className);
foreach (PropertyInfo property in propertyList)
{
DataColumn col = new DataColumn();
col.ColumnName = property.Name;
Type dataType = property.PropertyType;
if (IsNullable(dataType))
{
if(dataType.IsGenericType)
{
dataType = dataType.GenericTypeArguments.FirstOrDefault();
}
}
else
{ // True by default
col.AllowDBNull = false;
}
col.DataType = dataType;
result.Columns.Add(col);
}
return result;
}
public static DataTable ClassListToDataTable<T>(List<T> ClassList) where T : class
{
DataTable result = ClassToDataTable<T>();
if(result.Columns.Count < 1)
{
return new DataTable();
}
if(ClassList.Count < 1)
{
return result;
}
foreach(T item in ClassList)
{
ClassToDataRow(ref result, item);
}
return result;
}
public static void ClassToDataRow<T>(ref DataTable Table, T Data) where T : class
{
Type classType = typeof(T);
string className = classType.UnderlyingSystemType.Name;
// Checks that the table name matches the name of the class.
// This is not required, and it may be desirable to disable this check.
// Comment this out or add a boolean to the parameters to disable this check.
if (!Table.TableName.Equals(className))
{
return;
}
DataRow row = Table.NewRow();
List<PropertyInfo> propertyList = classType.GetProperties().ToList();
foreach (PropertyInfo prop in propertyList)
{
if (Table.Columns.Contains(prop.Name))
{
if (Table.Columns[prop.Name] != null)
{
row[prop.Name] = prop.GetValue(Data, null);
}
}
}
Table.Rows.Add(row);
}
public static bool IsNullable(Type Input)
{
if (!Input.IsValueType) return true; // Is a ref-type, such as a class
if (Nullable.GetUnderlyingType(Input) != null) return true; // Nullable
return false; // Must be a value-type
}
Here is an example of how to serialize a DataTable to XML, and load it back again:
string filePath = "order1.xml";
string schemaPath = Path.ChangeExtension(filePath,".xsd");
ordersTable.WriteXml(filePath);
ordersTable.WriteXmlSchema(schemaPath);
// Load
DataTable loadedTable = new DataTable();
loadedTable.ReadXmlSchema(schemaPath);
loadedTable.ReadXml(filePath);
// Set DataGrid's DataSource
dataGrid1.DataSource = loadedTable;
The full project and source code for EntityJustWorks can be found on GitHub and CodePlex.
Labels:
.net,
C#,
Cool,
csharp,
CSV,
Data,
Database,
DataBind,
DataTable,
Mapping Class,
Object Mapping,
Object Relational Mapping,
ORM,
PropertyInfo,
Reflection,
Serialization,
spreadsheet,
SQL,
Table,
XML
Tuesday, July 9, 2013
Procedural generation of cave-like maps for rogue-like games
This post is about procedural content generation of cave-like dungeons/maps for rogue-like games using what is known as the Cellular Automata method.
To understand what I mean by the cellular automata method, imagine Conway's Game of Life. Many algorithms use what is called the '4-5 method': a tile becomes a wall if it is already a wall and 4 or more of its eight neighbors are walls, or if it is not a wall and 5 or more of its neighbors are walls. The map is first 'seeded' by randomly filling each cell with a wall or open space based on some weight (say, 40% of the time it places a wall), then each x/y position is visited iteratively and the 4-5 rule is applied. The automata step is applied multiple times over the entire map, precipitating walls and subsequently smoothing them. About 3 rounds is all that is required, with 4-5 rounds being pretty typical amongst implementations. Perhaps a picture of the output will help you understand what I mean.
Using the automata-based method for procedural generation of levels will produce something similar to this:
Sure, the dungeon-generation-by-linking-rooms approach has its place, but I really like the 'natural' look of the automata-inspired method. I first discovered this technique on a website called RogueBasin. It is a great resource for information on the different problems involved with programming a rogue-like game, such as NetHack or Angband.
One of the major problems developers run into while employing this technique is the formation of isolated caves. Instead of one big cave-like room, you may get sections of the map that are inaccessible without digging through walls. Isolated caves can trap key items or (worse) the stairs leading to the next level, preventing further progress in the game. I have seen many different approaches proposed to solve this problem, including discarding maps that have isolated caves, filling in the isolated sections, or fine-tuning the variables/rules to reduce the occurrence of such maps. None of these are ideal (in my mind), and most require a way to detect isolated sections, which is another non-trivial problem in itself.
Despite this, I consider the problem solved, because I have discovered a solution that is dead simple and almost** never fails, because the rules of the automata generation itself dictate as much. I call my method 'Horizontal Blanking', and you can probably guess how it works just from the name. This step comes after the random filling of the map (initialization), but before the cellular automata iterations. After the map is 'seeded' with a random fill of walls, a horizontal strip in the middle of the map is cleared of all walls. The strip is about 3 or 4 blocks tall (depending on the rules). Clearing a horizontal strip of sufficient height will prevent a continuous vertical wall from being created and forming isolated caves in your maps. After horizontal blanking, you can begin applying the cellular automata method to your map.
** I say 'almost' because although it is not possible to get whole rooms that are disconnected from each other, it is possible to get tiny pockets of blank space in the northern or southern walls that are about 4-5 blocks in total area. Often these little holes will resolve themselves during the rounds of automata rules, but the possibility remains that one may persist. My answer to this edge case would be to use some rules around the placement of stairwells (and other must-find items), dictating that such objects must have a 2-3 block radius clear of walls to be placed.
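For clarity, here is roughly what a standalone horizontal-blanking pass could look like (a sketch that assumes the same Map, MapWidth and MapHeight members as the MapHandler class below; the strip height of 4 is an arbitrary choice):
public void HorizontalBlank(int stripHeight = 4)
{
    int middle = MapHeight / 2;
    int top = middle - (stripHeight / 2);
    for (int row = top; row < top + stripHeight; row++)
    {
        if (row <= 0 || row >= MapHeight - 1) continue; // leave the border walls alone
        for (int column = 1; column < MapWidth - 1; column++)
        {
            Map[column, row] = 0; // clear the strip so no continuous vertical wall can form
        }
    }
}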
See below for the code that produced the above screenshot, or click here to download the entire project with source code that you can compile yourself.
public class MapHandler
{
Random rand = new Random();
public int[,] Map;
public int MapWidth { get; set; }
public int MapHeight { get; set; }
public int PercentAreWalls { get; set; }
public MapHandler()
{
MapWidth = 40;
MapHeight = 21;
PercentAreWalls = 40;
RandomFillMap();
}
public void MakeCaverns()
{
// By initializing column in the outer loop, it's only created ONCE
for(int column=0, row=0; row <= MapHeight-1; row++)
{
for(column = 0; column <= MapWidth-1; column++)
{
Map[column,row] = PlaceWallLogic(column,row);
}
}
}
public int PlaceWallLogic(int x,int y)
{
int numWalls = GetAdjacentWalls(x,y,1,1);
if(Map[x,y]==1)
{
if( numWalls >= 4 )
{
return 1;
}
return 0;
}
else
{
if(numWalls>=5)
{
return 1;
}
}
return 0;
}
public int GetAdjacentWalls(int x,int y,int scopeX,int scopeY)
{
int startX = x - scopeX;
int startY = y - scopeY;
int endX = x + scopeX;
int endY = y + scopeY;
int iX = startX;
int iY = startY;
int wallCounter = 0;
for(iY = startY; iY <= endY; iY++) {
for(iX = startX; iX <= endX; iX++)
{
if(!(iX==x && iY==y))
{
if(IsWall(iX,iY))
{
wallCounter += 1;
}
}
}
}
return wallCounter;
}
bool IsWall(int x,int y)
{
// Consider out-of-bound a wall
if( IsOutOfBounds(x,y) )
{
return true;
}
if( Map[x,y]==1 )
{
return true;
}
if( Map[x,y]==0 )
{
return false;
}
return false;
}
bool IsOutOfBounds(int x, int y)
{
if( x<0 || y<0 || x>MapWidth-1 || y>MapHeight-1 )
{
return true;
}
return false;
}
Above is the main core of the logic. Here is the rest of the program, such as filling, printing and blanking:
public void PrintMap()
{
Console.Clear();
Console.Write(MapToString());
}
string MapToString()
{
string returnString = string.Join(" ", // Separator between each element
"Width:",
MapWidth.ToString(),
"\tHeight:",
MapHeight.ToString(),
"\t% Walls:",
PercentAreWalls.ToString(),
Environment.NewLine
);
List<string> mapSymbols = new List<string>();
mapSymbols.Add(".");
mapSymbols.Add("#");
mapSymbols.Add("+");
for(int column=0,row=0; row < MapHeight; row++ ) {
for( column = 0; column < MapWidth; column++ )
{
returnString += mapSymbols[Map[column,row]];
}
returnString += Environment.NewLine;
}
return returnString;
}
public void BlankMap()
{
for(int column=0,row=0; row < MapHeight; row++) {
for(column = 0; column < MapWidth; column++) {
Map[column,row] = 0;
}
}
}
public void RandomFillMap()
{
// New, empty map
Map = new int[MapWidth,MapHeight];
int mapMiddle = 0; // Temp variable
for(int column=0,row=0; row < MapHeight; row++) {
for(column = 0; column < MapWidth; column++)
{
// If coordinates lie on the edge of the map
// (creates a border)
if(column == 0)
{
Map[column,row] = 1;
}
else if (row == 0)
{
Map[column,row] = 1;
}
else if (column == MapWidth-1)
{
Map[column,row] = 1;
}
else if (row == MapHeight-1)
{
Map[column,row] = 1;
}
// Else, fill with a wall a random percent of the time
else
{
mapMiddle = (MapHeight / 2);
if(row == mapMiddle)
{
Map[column,row] = 0;
}
else
{
Map[column,row] = RandomPercent(PercentAreWalls);
}
}
}
}
}
int RandomPercent(int percent)
{
if(percent>=rand.Next(1,101))
{
return 1;
}
return 0;
}
public MapHandler(int mapWidth, int mapHeight, int[,] map, int percentWalls=40)
{
this.MapWidth = mapWidth;
this.MapHeight = mapHeight;
this.PercentAreWalls = percentWalls;
this.Map = new int[this.MapWidth,this.MapHeight];
this.Map = map;
}
}
And of course, the main function:
public static void Main(string[] args)
{
char key = new Char();
MapHandler Map = new MapHandler();
string instructions =
"[Q]uit [N]ew [+][-]Percent walls [R]andom [B]lank" + Environment.NewLine +
"Press any other key to smooth/step";
Map.MakeCaverns();
Map.PrintMap();
Console.WriteLine(instructions);
key = Char.ToUpper(Console.ReadKey(true).KeyChar);
while(!key.Equals('Q'))
{
if(key.Equals('+')) {
Map.PercentAreWalls+=1;
Map.RandomFillMap();
Map.MakeCaverns();
Map.PrintMap();
} else if(key.Equals('-')) {
Map.PercentAreWalls-=1;
Map.RandomFillMap();
Map.MakeCaverns();
Map.PrintMap();
} else if(key.Equals('R')) {
Map.RandomFillMap();
Map.PrintMap();
} else if(key.Equals('N')) {
Map.RandomFillMap();
Map.MakeCaverns();
Map.PrintMap();
} else if(key.Equals('B')) {
Map.BlankMap();
Map.PrintMap();
} else if(key.Equals('D')) {
// I set a breakpoint here...
} else {
Map.MakeCaverns();
Map.PrintMap();
}
Console.WriteLine(instructions);
key = Char.ToUpper(Console.ReadKey(true).KeyChar);
}
Console.Clear();
Console.Write(" Thank you for playing!");
Console.ReadKey(true);
}
See also: Roguebasin - Cellular Automata Method for Generating Random Cave-Like Levels
Labels:
.net,
Algorithm,
C#,
Cellular automata,
Class,
Cool,
csharp,
Games,
hack,
Math,
Procedural content generation,
Random,
Rogue-like,
RPG,
Statistics
Thursday, June 27, 2013
XML Serializable Dictionary, Tuple, and Object
Serializing a data class to an XML file using XmlSerializer is very useful. However, some of the most useful data classes in .NET are not serializable, most notably Dictionary and Tuple. If you are looking for a blog post on how to make a Dictionary that accepts duplicate keys by storing the values with identical keys in a List, please see this blog post.
The SerializableDictionary class below works by implementing IXmlSerializable, which requires you to implement the following three methods:
* GetSchema() - Remember, you should always return null for this function.
* ReadXml(XmlReader reader)
* WriteXml(XmlWriter writer)
(Read about IXmlSerializable on MSDN)
Here is the code to serialize a dictionary or serialize a tuple:
namespace XMLSerializableDictionary
{
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.Xml;
using System.Xml.Schema;
using System.Xml.Serialization;
[Serializable]
[XmlRoot("Dictionary")]
public class SerializableDictionary<TKey, TValue>
: Dictionary<TKey, TValue>, IXmlSerializable
{
private const string DefaultTagItem = "Item";
private const string DefaultTagKey = "Key";
private const string DefaultTagValue = "Value";
private static readonly XmlSerializer KeySerializer =
new XmlSerializer(typeof(TKey));
private static readonly XmlSerializer ValueSerializer =
new XmlSerializer(typeof(TValue));
public SerializableDictionary() : base()
{
}
protected SerializableDictionary(SerializationInfo info, StreamingContext context)
: base(info, context)
{
}
protected virtual string ItemTagName
{
get { return DefaultTagItem; }
}
protected virtual string KeyTagName
{
get { return DefaultTagKey; }
}
protected virtual string ValueTagName
{
get { return DefaultTagValue; }
}
public XmlSchema GetSchema()
{
return null;
}
public void ReadXml(XmlReader reader)
{
bool wasEmpty = reader.IsEmptyElement;
reader.Read();
if (wasEmpty)
{
return;
}
try
{
while (reader.NodeType != XmlNodeType.EndElement)
{
reader.ReadStartElement(this.ItemTagName);
try
{
TKey tKey;
TValue tValue;
reader.ReadStartElement(this.KeyTagName);
try
{
tKey = (TKey)KeySerializer.Deserialize(reader);
}
finally
{
reader.ReadEndElement();
}
reader.ReadStartElement(this.ValueTagName);
try
{
tValue = (TValue)ValueSerializer.Deserialize(reader);
}
finally
{
reader.ReadEndElement();
}
this.Add(tKey, tValue);
}
finally
{
reader.ReadEndElement();
}
reader.MoveToContent();
}
}
finally
{
reader.ReadEndElement();
}
}
public void WriteXml(XmlWriter writer)
{
foreach (KeyValuePair<TKey, TValue> keyValuePair in this)
{
writer.WriteStartElement(this.ItemTagName);
try
{
writer.WriteStartElement(this.KeyTagName);
try
{
KeySerializer.Serialize(writer, keyValuePair.Key);
}
finally
{
writer.WriteEndElement();
}
writer.WriteStartElement(this.ValueTagName);
try
{
ValueSerializer.Serialize(writer, keyValuePair.Value);
}
finally
{
writer.WriteEndElement();
}
}
finally
{
writer.WriteEndElement();
}
}
}
}
}
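As a quick usage sketch (the file name and contents are just examples), the dictionary round-trips through a plain XmlSerializer because it implements IXmlSerializable:
SerializableDictionary<string, int> fruit = new SerializableDictionary<string, int>();
fruit.Add("apples", 3);
fruit.Add("oranges", 5);

XmlSerializer serializer = new XmlSerializer(typeof(SerializableDictionary<string, int>));
using (StreamWriter writer = new StreamWriter("fruit.xml"))
{
    serializer.Serialize(writer, fruit);
}
using (StreamReader reader = new StreamReader("fruit.xml"))
{
    SerializableDictionary<string, int> roundTripped =
        (SerializableDictionary<string, int>)serializer.Deserialize(reader);
    Console.WriteLine(roundTripped["apples"]); // 3
}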
The idea behind the serializable tuple is that we just make our own Tuple that stores the items by declaring properties of the generic T types to represent them. If you are not used to working with generics, this can look a little strange: T1, T2 and T3 are just placeholders for the types that are determined by the calling function, or by the function above that if the calling function uses generics too.
And a serializable tuple:
public class SerializableTuple<T1,T2,T3>
{
public T1 Item1 { get; set; }
public T2 Item2 { get; set; }
public T3 Item3 { get; set; }
public static implicit operator Tuple<T1,T2,T3>(SerializableTuple<T1,T2,T3> st)
{
return Tuple.Create(st.Item1,st.Item2,st.Item3);
}
public static implicit operator SerializableTuple<T1,T2,T3>(Tuple<T1,T2,T3> t)
{
return new SerializableTuple<T1,T2,T3>() {
Item1 = t.Item1,
Item2 = t.Item2,
Item3 = t.Item3
};
}
public SerializableTuple()
{
}
}
And finally, a generic object serializer and deserializer:
public static class XML
{
public static class Serialize
{
public static void Object(string Filename, object obj)
{
using (StreamWriter streamWriter = new StreamWriter(Filename))
{
XmlSerializer xmlSerializer = new XmlSerializer(obj.GetType());
xmlSerializer.Serialize(streamWriter, obj);
}
}
}
public static class DeSerialize
{
public static string Generic<T>(T data)
{
if (data == null)
return string.Empty;
string content = string.Empty;
using (MemoryStream memoryStream = new MemoryStream())
{
XmlSerializer serializer = new XmlSerializer(typeof(T));
serializer.Serialize(memoryStream, data);
memoryStream.Seek(0, SeekOrigin.Begin);
using (StreamReader reader = new StreamReader(memoryStream))
{
content = reader.ReadToEnd();
}
}
return content;
}
public static object Object(string Filename, Type type)
{
object result = null;
using (TextReader reader = new StreamReader(Filename)) // read the file, not the literal string
{
XmlSerializer serializer = new XmlSerializer(type);
result = serializer.Deserialize(reader);
}
return result;
}
}
}
And perhaps after you serialize your data to an XML file, you would like to generate a schema XML file from it:
void XmlToSchema(string FileName)
{
XmlReader xmlReader = XmlReader.Create(FileName);
XmlSchemaSet schemaSet = new XmlSchemaSet();
XmlSchemaInference schemaInfer = new XmlSchemaInference();
schemaSet = schemaInfer.InferSchema(xmlReader);
string outFilename = Path.ChangeExtension(FileName,".xsd");
using(Stream streamOut = new FileStream(outFilename,FileMode.Create) )
{
TextWriter textWriter = new StreamWriter(streamOut);
foreach (XmlSchema s in schemaSet.Schemas())
{
s.Write(textWriter);
}
textWriter.Close();
}
}
Labels:
.net,
C#,
Class,
Cool,
csharp,
Data,
Data Structure,
Dictionary,
List,
Object,
Schema,
Serialization,
Table,
Tuple,
XML
PeriodicTable Element Class
The periodic table as a List of class Element in C#
public class Element
{
public int AtomicNumber { get; set; }
public string Symbol { get; set; }
public string Name { get; set; }
public decimal AtomicWeight { get; set; }
// public string GroupNumber { get; set; }
// public string GroupName { get; set; }
// public string Period { get; set; }
// public string Block { get; set; }
// public string CASRegistryID { get; set; }
// public string DiscoveryDate { get; set; }
// public string DiscovererName { get; set; }
public Element() { }
public Element(int atomicNumber,string symbol,string name,decimal atomicWeight)
{
AtomicNumber = atomicNumber;
Symbol = symbol;
Name = name;
AtomicWeight = atomicWeight;
}
}
public class PeriodicTable
{
public List<Element> Elements;
public PeriodicTable()
{
Elements = new List<Element>();
Elements.Add(new Element(1, "H", "Hydrogen", 1.007825M ));
Elements.Add(new Element(2, "He", "Helium", 4.00260M ));
Elements.Add(new Element(3, "Li", "Lithium", 6.941M ));
Elements.Add(new Element(4, "Be", "Beryllium", 9.01218M ));
Elements.Add(new Element(5, "B", "Boron", 10.81M ));
Elements.Add(new Element(6, "C", "Carbon", 12.011M ));
Elements.Add(new Element(7, "N", "Nitrogen", 14.0067M ));
Elements.Add(new Element(8, "O", "Oxygen", 15.999M ));
Elements.Add(new Element(9, "F", "Fluorine", 18.99840M ));
Elements.Add(new Element(10, "Ne", "Neon", 20.179M ));
Elements.Add(new Element(11, "Na", "Sodium", 22.98977M ));
Elements.Add(new Element(12, "Mg", "Magnesium", 24.305M ));
Elements.Add(new Element(13, "Al", "Aluminum", 26.98154M ));
Elements.Add(new Element(14, "Si", "Silicon", 28.0855M ));
Elements.Add(new Element(15, "P", "Phosphorus", 30.97376M ));
Elements.Add(new Element(16, "S", "Sulphur", 32.06M ));
Elements.Add(new Element(17, "Cl", "Chlorine", 35.453M ));
Elements.Add(new Element(18, "Ar", "Argon", 39.948M ));
Elements.Add(new Element(19, "K", "Potassium", 39.0983M ));
Elements.Add(new Element(20, "Ca", "Calcium", 40.08M ));
Elements.Add(new Element(21, "Sc", "Scandium", 44.9559M ));
Elements.Add(new Element(22, "Ti", "Titanium", 47.90M ));
Elements.Add(new Element(23, "V", "Vanadium", 50.9414M ));
Elements.Add(new Element(24, "Cr", "Chromium", 51.996M ));
Elements.Add(new Element(25, "Mn", "Manganese", 54.9380M ));
Elements.Add(new Element(26, "Fe", "Iron", 55.85M ));
Elements.Add(new Element(27, "Co", "Cobalt", 58.9332M ));
Elements.Add(new Element(28, "Ni", "Nickel", 58.71M ));
Elements.Add(new Element(29, "Cu", "Copper", 63.546M ));
Elements.Add(new Element(30, "Zn", "Zinc", 65.37M ));
Elements.Add(new Element(31, "Ga", "Gallium", 69.72M ));
Elements.Add(new Element(32, "Ge", "Germanium", 72.59M ));
Elements.Add(new Element(33, "As", "Arsenic", 74.9216M ));
Elements.Add(new Element(34, "Se", "Selenium", 78.96M ));
Elements.Add(new Element(35, "Br", "Bromine", 79.904M ));
Elements.Add(new Element(36, "Kr", "Krypton", 83.80M ));
Elements.Add(new Element(37, "Rb", "Rubidium", 85.4678M ));
Elements.Add(new Element(38, "Sr", "Strontium", 87.62M ));
Elements.Add(new Element(39, "Y", "Yttrium", 88.9059M ));
Elements.Add(new Element(40, "Zr", "Zirconium", 91.22M ));
Elements.Add(new Element(41, "Nb", "Niobium", 92.91M ));
Elements.Add(new Element(42, "Mo", "Molybdenum", 95.94M ));
Elements.Add(new Element(43, "Tc", "Technetium", 99.0M ));
Elements.Add(new Element(44, "Ru", "Ruthenium", 101.1M ));
Elements.Add(new Element(45, "Rh", "Rhodium", 102.91M ));
Elements.Add(new Element(46, "Pd", "Palladium", 106.42M ));
Elements.Add(new Element(47, "Ag", "Silver", 107.87M ));
Elements.Add(new Element(48, "Cd", "Cadmium", 112.4M ));
Elements.Add(new Element(49, "In", "Indium", 114.82M ));
Elements.Add(new Element(50, "Sn", "Tin", 118.69M ));
Elements.Add(new Element(51, "Sb", "Antimony", 121.75M ));
Elements.Add(new Element(52, "Te", "Tellurium", 127.6M ));
Elements.Add(new Element(53, "I", "Iodine", 126.9045M ));
Elements.Add(new Element(54, "Xe", "Xenon", 131.29M ));
Elements.Add(new Element(55, "Cs", "Cesium", 132.9054M ));
Elements.Add(new Element(56, "Ba", "Barium", 137.33M ));
Elements.Add(new Element(57, "La", "Lanthanum", 138.91M ));
Elements.Add(new Element(58, "Ce", "Cerium", 140.12M ));
Elements.Add(new Element(59, "Pr", "Praseodymium", 140.91M ));
Elements.Add(new Element(60, "Nd", "Neodymium", 144.24M ));
Elements.Add(new Element(61, "Pm", "Promethium", 147.0M ));
Elements.Add(new Element(62, "Sm", "Samarium", 150.35M ));
Elements.Add(new Element(63, "Eu", "Europium", 151.96M ));
Elements.Add(new Element(64, "Gd", "Gadolinium", 157.25M ));
Elements.Add(new Element(65, "Tb", "Terbium", 158.925M ));
Elements.Add(new Element(66, "Dy", "Dysprosium", 162.50M ));
Elements.Add(new Element(67, "Ho", "Holmium", 164.9M ));
Elements.Add(new Element(68, "Er", "Erbium", 167.26M ));
Elements.Add(new Element(69, "Tm", "Thulium", 168.93M ));
Elements.Add(new Element(70, "Yb", "Ytterbium", 173.04M ));
Elements.Add(new Element(71, "Lu", "Lutetium", 174.97M ));
Elements.Add(new Element(72, "Hf", "Hafnium", 178.49M ));
Elements.Add(new Element(73, "Ta", "Tantalum", 180.95M ));
Elements.Add(new Element(74, "W", "Tungsten", 183.85M ));
Elements.Add(new Element(75, "Re", "Rhenium", 186.23M ));
Elements.Add(new Element(76, "Os", "Osmium", 190.2M ));
Elements.Add(new Element(77, "Ir", "Iridium", 192.2M ));
Elements.Add(new Element(78, "Pt", "Platinum", 195.09M ));
Elements.Add(new Element(79, "Au", "Gold", 196.9655M ));
Elements.Add(new Element(80, "Hg", "Mercury", 200.59M ));
Elements.Add(new Element(81, "Tl", "Thallium", 204.383M ));
Elements.Add(new Element(82, "Pb", "Lead", 207.2M ));
Elements.Add(new Element(83, "Bi", "Bismuth", 208.9804M ));
Elements.Add(new Element(84, "Po", "Polonium", 210.0M ));
Elements.Add(new Element(85, "At", "Astatine", 210.0M ));
Elements.Add(new Element(86, "Rn", "Radon", 222.0M ));
Elements.Add(new Element(87, "Fr", "Francium", 233.0M ));
Elements.Add(new Element(88, "Ra", "Radium", 226.0254M ));
Elements.Add(new Element(89, "Ac", "Actinium", 227.0M ));
Elements.Add(new Element(90, "Th", "Thorium", 232.04M ));
Elements.Add(new Element(91, "Pa", "Protactinium", 231.0359M ));
Elements.Add(new Element(92, "U", "Uranium", 238.03M ));
Elements.Add(new Element(93, "Np", "Neptunium", 237.0M ));
Elements.Add(new Element(94, "Pu", "Plutonium", 244.0M ));
Elements.Add(new Element(95, "Am", "Americium", 243.0M ));
Elements.Add(new Element(96, "Cm", "Curium", 247.0M ));
Elements.Add(new Element(97, "Bk", "Berkelium", 247.0M ));
Elements.Add(new Element(98, "Cf", "Californium", 251.0M ));
Elements.Add(new Element(99, "Es", "Einsteinium", 254.0M ));
Elements.Add(new Element(100, "Fm", "Fermium", 257.0M ));
Elements.Add(new Element(101, "Md", "Mendelevium", 258.0M ));
Elements.Add(new Element(102, "No", "Nobelium", 259.0M ));
Elements.Add(new Element(103, "Lr", "Lawrencium", 262.0M ));
Elements.Add(new Element(104, "Rf", "Rutherfordium",260.9M ));
Elements.Add(new Element(105, "Db", "Dubnium", 261.9M ));
Elements.Add(new Element(106, "Sg", "Seaborgium", 262.94M ));
Elements.Add(new Element(107, "Bh", "Bohrium", 262.0M ));
Elements.Add(new Element(108, "Hs", "Hassium", 264.8M ));
Elements.Add(new Element(109, "Mt", "Meitnerium", 265.9M ));
Elements.Add(new Element(110, "Ds", "Darmstadtium", 261.9M ));
Elements.Add(new Element(112, "Uub", "Ununbium", 276.8M ));
Elements.Add(new Element(114, "Uuq", "Ununquadium", 289.0M ));
Elements.Add(new Element(116, "Uuh", "Ununhexium", 0.0M ));
}
}
Improvements could include implementing the PeriodicTable as a Dictionary, such as a Dictionary<string,Element>. This adds the ability to retrieve Element information given only its symbol (or only its atomic number, if you key on that instead), as sketched below. Use a SortedDictionary<string,Element> to keep the elements ordered by symbol, name, atomic number or weight (the dictionary is sorted on its key). Admittedly, this still isn't a very useful class; it's just for fun.
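A minimal sketch of that idea, keyed on the symbol (requires System.Linq for ToDictionary):
PeriodicTable table = new PeriodicTable();
Dictionary<string, Element> bySymbol = table.Elements.ToDictionary(e => e.Symbol);

Element gold = bySymbol["Au"];
Console.WriteLine("{0} ({1}) has atomic weight {2}",
    gold.Name, gold.AtomicNumber, gold.AtomicWeight);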
Fake/Random Identity Generator
Inspiration
During my research on RSA cryptography and the importance of a truly random number for having a large key space, I stumbled onto FakeNameGenerator.com. I thought the concept could be really useful for certain applications, and I could easily envision how to implement it in C# and make it extensible/customizable.
Take a look at these:
<?xml version="1.0" standalone="yes"?>
<DocumentElement>
<Order>
<Date>3/18/2005</Date>
<TrackingNumber>1Z 8A8 238 01 9398 182 1</TrackingNumber>
<FirstName>Keaton </FirstName>
<LastName>Day</LastName>
<StreetAddress>4828 Cherry St.</StreetAddress>
<City>Nanticoke</City>
<State>SC</State>
<Zip>89130</Zip>
<Email>HaleHale8026@mail.com</Email>
<Phone>425-765-4520</Phone>
</Order>
<Payroll>
<PhoneNumber>971-258-5703</PhoneNumber>
<AltPhoneNumber>501-769-1331</AltPhoneNumber>
<FirstName>Xyla </FirstName>
<LastName>Hoover</LastName>
<EmployeeID>499</EmployeeID>
<HireDate>5/28/2011</HireDate>
<Birthdate>5/28/1990</Birthdate>
<SSN>520-52-4275</SSN>
<AccountNumber>5696618825</AccountNumber>
<RoutingNumber>575159859</RoutingNumber>
<Address>8348 Court Ave.</Address>
<City>Pittsburgh,</City>
<State>PA.</State>
<Zip>15201</Zip>
</Payroll>
CREATE TABLE ReservationData (
`id` mediumint(8) unsigned NOT NULL auto_increment,
`UniqueID` MEDIUMINT default NULL,
`TripDate` varchar(50) default NULL,
`FirstName` varchar(255) default NULL,
`LastName` varchar(255) default NULL,
`Phone` varchar(100) default NULL,
`AltPhone` varchar(100) default NULL,
`Email` varchar(255) default NULL,
`StreetAddress` varchar(255) default NULL,
`City` varchar(50) default NULL,
`State` varchar(50) default NULL,
`Zip` varchar(10) default NULL,
`Country` varchar(255) default NULL,
`DayOfYear` varchar(50) default NULL,
`TotalCost` varchar(50) default NULL,
`Balance` varchar(10) default NULL,
`CCard` varchar(18) default NULL,
`Expires` varchar(5) default NULL,
`CVC2` varchar(3) default NULL,
PRIMARY KEY (`id`)
) TYPE=MyISAM AUTO_INCREMENT=1;
This would make great honey for a honeypot: just fill a SQL database with this random, realistic-looking data, serve it, and log any and all attempts to access, query or dump the database. This can be done on a VM, and you have an easily deployed, high-interaction honeypot!
Aside from being able to see their IP address, I think the most useful data that can be obtained is their behavior: what injection attacks are they using to drop the database? Write rules to protect your honeypot against trivial attempts such as the ' AND 1=(SELECT attacks, and see what they come up with next. Rule writing is inherently a cat-and-mouse game, and honeypots like this clearly give the white hats the upper hand.
Implementation
A quick, fast and dirty solution is to simply read a random line from a text file (i.e. Name_First.txt and Address_Street.txt).
This way, you can choose from names that are common, or customize your list to for different nationalities.
One could read the whole file into a string, Split() it into an array of strings, then randomly select an index, but this would be unacceptable for very large files. Instead, we can seek the file pointer to a random position that is less than the file's size, skip ahead to the start of the next line, and call ReadLine.
public string ReturnRandomLine(string FileName)
{
string sReturn = string.Empty;
using(FileStream myFile = new FileStream(FileName,FileMode.Open,FileAccess.Read))
{
using(StreamReader myStream = new StreamReader(myFile))
{
// Seek file stream pointer to a rand position...
myStream.BaseStream.Seek(rand.Next(1,(int)myFile.Length),SeekOrigin.Begin);
// Read the rest of that line.
myStream.ReadLine();
// Return the next, full line...
sReturn = myStream.ReadLine();
}
}
// If our random file position was too close to the end of the file, it will return an empty string
// I avoided a while loop in the case that the file is empty or contains only one line
if(System.String.IsNullOrWhiteSpace(sReturn)) {
sReturn = ReturnRandomLine(FileName);
}
return sReturn;
}
Example use:
public string GenerateFirstName()
{
return ReturnRandomLine("Name_First.txt") + " ";
}
public string GenerateLastName()
{
return ReturnRandomLine("Name_Last.txt");
}
public string GenerateFullName()
{
return GenerateFirstName() + GenerateLastName();
}
public string GenerateGender()
{
if(ReturnPercent(84)) {
return "Male";
} else {
return "Female";
}
}
public string GenerateStreetNumber()
{
return rand.Next(1,9999).ToString();
}
public string GenerateStreetName()
{
return ReturnRandomLine("Address_Street.txt");
}
One limitation is where the data is relational, such as generating a random ZIP code along with the city and state it actually exists in. A quick workaround would be a combined CityZipState.txt file, where each line holds a city, state and ZIP that belong together, as sketched below.
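A sketch of that workaround, assuming each line of a hypothetical CityZipState.txt holds a matching city, state and ZIP separated by tabs:
public string[] GenerateCityStateZip()
{
    // e.g. "Pittsburgh\tPA\t15201" -> { "Pittsburgh", "PA", "15201" }
    return ReturnRandomLine("CityZipState.txt").Split('\t');
}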
Other types of data that can be generated that would not make sense to put in a text file:
public bool ReturnPercent(int Percent) // Return true Percent times out of 100, randomly
{
int iTemp = rand.Next(1,101);
if(iTemp<=Percent) {
return true;
} else {
return false;
}
}
public string GenerateDate(int YearFrom,int YearTo)
{
int Month = rand.Next(1,13);
int Day = rand.Next(1,32);
string Year = GenerateYear(YearFrom,YearTo);
return Month.ToString() + "/" + Day.ToString() + "/" + Year;
}
public string GenerateYear(int YearFrom,int YearTo)
{
return rand.Next(YearFrom,YearTo+1).ToString();
}
public string GeneratePhoneNumber()
{
return GeneratePhoneNumber(ReturnRandomLine("PhoneNumber_Prefix.txt"));
}
public string GeneratePhoneNumber(string Prefix)
{
int iThree = rand.Next(192,999);
int iFour = rand.Next(1000,9999);
return Prefix + iThree.ToString() + "-" + iFour.ToString();
}
public string GenerateSSN()
{
int iThree = rand.Next(132,921);
int iTwo = rand.Next(12,83);
int iFour = rand.Next(1423,9211);
return iThree.ToString() + "-" + iTwo.ToString() + "-" + iFour.ToString();
}
Obviously, these methods can be improved to conform to the standards of a real social security number, national identification number, credit card number, etc...
public string GenerateCCNum()
{
string sCCNum = string.Empty;
byte[] bCCNum = {0};
rand.NextBytes(bCCNum);
// generate random 16 digit number
int iTemp1 = rand.Next(10000000,99999999);
int iTemp2 = rand.Next(10000000,99999999);
string sTemp = iTemp1.ToString() + iTemp2.ToString();
// Keep generating until the number passes validation
while(!IsValidNumber(sTemp))
{
iTemp1 = rand.Next(10000000,99999999);
iTemp2 = rand.Next(10000000,99999999);
sTemp = iTemp1.ToString() + iTemp2.ToString();
}
sCCNum = sTemp;
return sCCNum;
}
The implementation of IsValidNumber() is left as an exercise for the reader.
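For anyone who wants a head start, a common choice is the Luhn checksum used by real card numbers; below is a minimal sketch of an IsValidNumber built on it (my illustration, not code from the original post):
public bool IsValidNumber(string number)
{
    int sum = 0;
    bool doubleIt = false;
    // Walk the digits right-to-left, doubling every second digit (Luhn algorithm)
    for (int i = number.Length - 1; i >= 0; i--)
    {
        int digit = number[i] - '0';
        if (doubleIt)
        {
            digit *= 2;
            if (digit > 9) { digit -= 9; }
        }
        sum += digit;
        doubleIt = !doubleIt;
    }
    return (sum % 10) == 0;
}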
The serialization of your data is a trivial matter. Please see my post on a XML Serializable Dictionary, Tuple, and Object for the code to serialize an object (such as a list, or a class).
Sunday, June 23, 2013
Graceful error handling with a global exception handler
Every published C# application should have graceful error handling. Here I show you the implementation of a global exception handler using ThreadExceptionEventHandler.
First, you have to add System.Threading to both your Program.cs and Mainform.cs:
// Program.cs and Mainform.cs
using System.Threading;
Then add an event handler to Application.ThreadException:
// Program.cs
// static class Program {
// private static void Main(string[] args) {
Application.ThreadException += new ThreadExceptionEventHandler(MainForm.MyExceptionHandler);
// Application.Run(new MainForm());
Or, if you are writing a console app, add an event handler to AppDomain.UnhandledException (note that this event uses its own delegate signature, (object sender, UnhandledExceptionEventArgs e), so the handler body differs slightly from the WinForms one below):
AppDomain.CurrentDomain.UnhandledException += new UnhandledExceptionEventHandler(MyExceptionHandler);
Then add the exception handler body:
// Mainform.cs
public static void MyExceptionHandler(object sender, ThreadExceptionEventArgs e)
{
MessageBox.Show(e.Exception.Message,"Error",MessageBoxButtons.OK,MessageBoxIcon.Error);
}
The example here simply shows a message box, but an even more graceful approach would be to log all unhandled exceptions to a log file. That way, the errors are invisible to the user, but the developer still has access to detailed debug information.
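For example, the same handler could append each exception to a text file instead of (or in addition to) showing a message box. A rough sketch, with the log file name being an arbitrary choice (requires System.IO):
public static void MyExceptionHandler(object sender, ThreadExceptionEventArgs e)
{
    string entry = string.Format("[{0}] {1}{2}{3}{2}",
        DateTime.Now, e.Exception.Message, Environment.NewLine, e.Exception.StackTrace);
    File.AppendAllText("error.log", entry);
    MessageBox.Show("An unexpected error occurred and has been logged.",
        "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
}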
Here is the full code:
Program.cs
MainForm.cs
Labels:
.net,
Architecture,
Best practices,
C#,
csharp,
Error handling,
Exception,
security,
snippets,
UnhandledExceptionHandler,
User Experience,
UX