Yesterday I found an annoying flaw in Rhino Mocks. I expected that when you say Repeat.Twice() the method has to be executed exactly two times, and that the expectation should fail otherwise. But it turns out it means "at least twice" (despite the fact that there is also a Repeat.AtLeastOnce()).
So here is my setup:
public class Foo
{
public virtual void DoStuff()
{
Console.WriteLine("DoStuff");
}
}
public class Bar
{
public Foo foo { get; set; }
public void CallFoo(int max)
{
for (int i = 0; i < max; i++)
{
foo.DoStuff();
}
}
}
And unfortunately that test does not fail. You can even replace Times(2) with anything else - as long as the expected number of executions is less than the actual number, the test stays green. And you can replace it with Once() or AtLeastOnce() - no difference at all!
[Test]
public void MockTest()
{
var mock = MockRepository.GenerateMock<Foo>();
mock.Expect(x => x.DoStuff())
.Repeat.Times(2);
var bar = new Bar();
bar.foo = mock;
bar.CallFoo(3);
mock.VerifyAllExpectations();
}
And here is how you can write the test so that it fails - finally, a check for the exact number of calls!
[Test]
public void MockTest2()
{
var mock = MockRepository.GenerateMock<Foo>();
var bar = new Bar();
bar.foo = mock;
bar.CallFoo(3);
mock.AssertWasCalled(x => x.DoStuff(),
y => y.Repeat.Times(2));
}
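If you want to stay with the mock-and-verify style but still pin down the exact count, another option is to count the recorded calls yourself. This is only a sketch (it assumes Rhino Mocks 3.5+, where the GetArgumentsForCallsMadeOn extension is available; MockTest3 is just an illustrative name):
[Test]
public void MockTest3()
{
    var mock = MockRepository.GenerateMock<Foo>();
    var bar = new Bar();
    bar.foo = mock;
    bar.CallFoo(3);
    // one entry is recorded per call, so Count gives the exact number of invocations
    var calls = mock.GetArgumentsForCallsMadeOn(x => x.DoStuff());
    Assert.AreEqual(2, calls.Count); // fails, because DoStuff was actually called 3 times
}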
I've promised to write about my implementation of an async file uploader for a Web Forms application. The task was to implement it as a server control that can be added and reused on different pages (actually, in order to replace an old Telerik control).
I decided to use the jQuery File Upload plugin. The only problem was the URL to upload the file to: since it has to be reachable from different pages, I could not use a [WebMethod], so I've implemented an HTTP handler to receive the files. Plus a few additional changes on the client side - show the number of uploaded files, add an option to delete an uploaded file, etc.
Here is my implementation:
public class AsyncUploader : CompositeControl
{
private Panel _container;
private Panel _uploadedFilesContainer;
private HtmlInputFile _uploader;
public string[] AllowedFileExtensions { get; set; }
// Buttons whose client IDs are passed to the client-side script (see OnPreRender below)
public List<Control> _submitButtonsToDisable = new List<Control>();
/// <summary>
/// Maximum file size in bytes
/// </summary>
public int MaxFileSize { get; set; }
/// <summary>
/// Gets or sets the physical path of the folder where the valid files are saved automatically after the upload completes.
/// Note that when a file with the same name already exists, the upload handler appends a numeric suffix instead of overwriting it.
/// </summary>
public string TargetPhysicalFolder { get; set; }
public int MaxFilesCount { get; set; }
protected string EscapedUniqueId { get { return Regex.Replace(UniqueID, "[$.\\s]", "_"); } }
public List<UploadedFile> Files
{
get
{
EnsureChildControls();
var targetFolder = TargetPhysicalFolder;
var result = new List<UploadedFile>();
foreach (var key in Page.Request.Form.AllKeys)
{
if (key.StartsWith(EscapedUniqueId + "savedFile_"))
{
var index = key.Replace(EscapedUniqueId + "savedFile_", "");
var savedFileName = Page.Request.Form[key];
result.Add(new UploadedFile
{
OriginalFileName = Page.Request.Form[EscapedUniqueId + "originalFile_" + index],
SavedFileFullPath = Path.Combine(targetFolder, savedFileName),
SavedFileName = savedFileName
});
}
}
return result;
}
}
protected override void OnPreRender(EventArgs e)
{
if (_container.ClientIDMode != ClientIDMode.Static) throw new ArgumentException("ClientIDMode");
var uniqueId = EscapedUniqueId;
var options = new JavaScriptSerializer().Serialize(new
{
uniqueID = uniqueId,
elementSelector = "#" + uniqueId,
sessionId = Page.Session.SessionID,
maxFiles = MaxFilesCount,
maxFileSize = MaxFileSize,
extensions = AllowedFileExtensions != null && AllowedFileExtensions.Any()
? AllowedFileExtensions.ToDelimitedString("|")
: "*",
submitButtonsSelector = _submitButtonsToDisable
.Select(x => "#" + x.ClientID)
.ToDelimitedString(", ")
});
Page.Session[uniqueId + "_TargetFolder"] = TargetPhysicalFolder;
//emit a script block that calls the client-side setup function once the DOM is ready
var setUp = "<script type='text/javascript'>$(function () {{ setupfileUpload({0}); }});</script>".F(
options);
Page.ClientScript.RegisterClientScriptBlock(GetType(), "setup" + uniqueId, setUp);
base.OnPreRender(e);
}
protected override void CreateChildControls()
{
_container = new Panel();
_container.ClientIDMode = ClientIDMode.Static;
_container.ID = EscapedUniqueId;
_uploader = new HtmlInputFile();
_uploader.Attributes.Add("class", "filePicker");
if (MaxFilesCount > 1)
_uploader.Attributes.Add("multiple", "true");
_uploadedFilesContainer = new Panel();
_uploadedFilesContainer.Attributes.Add("class", "uploadedFiles");
_uploadedFilesContainer.ID = RandomString.Generate(20);
var errorsContainer = new Panel();
errorsContainer.Attributes.Add("class", "uploadErrors");
_container.Controls.Add(_uploader);
_container.Controls.Add(_uploadedFilesContainer);
_container.Controls.Add(errorsContainer);
this.Controls.Add(_container);
base.CreateChildControls();
}
protected override void OnLoad(EventArgs e)
{
//Register script and CSS resources (here I've used Peter Blum's ClientScriptLibrary)
ClientScriptLibrary
.RegisterEmbeddedResource(
typeof(AsyncUploader),
"jquery.ui.widget.js",
ClientDependencyType.Javascript);
ClientScriptLibrary
.RegisterEmbeddedResource(
typeof(AsyncUploader),
"jquery.fileupload.js",
ClientDependencyType.Javascript);
ClientScriptLibrary
.RegisterEmbeddedResource(
typeof(AsyncUploader),
"AsyncUploadFormItem.js",
ClientDependencyType.Javascript);
ClientScriptLibrary
.RegisterEmbeddedResource(
typeof(AsyncUploader),
"AsyncUploadFormItem.css",
ClientDependencyType.CSS);
base.OnLoad(e);
}
}
public class UploadedFile
{
public string OriginalFileName { get; set; }
public string SavedFileName { get; set; }
public string SavedFileFullPath { get; set; }
}
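Before moving on to the client side, here is roughly how a page consumes the control. This is only a sketch: the Uploader field, SaveButton_Click and SaveAttachment are hypothetical names and not part of the control itself.
public partial class EditPage : System.Web.UI.Page
{
    // on a real page this field is generated by the designer from the control declared in the markup
    protected AsyncUploader Uploader;

    protected void SaveButton_Click(object sender, EventArgs e)
    {
        // Files reads the hidden inputs that the client-side script posts back after each upload
        foreach (UploadedFile file in Uploader.Files)
        {
            SaveAttachment(file.OriginalFileName, file.SavedFileFullPath);
        }
    }

    private void SaveAttachment(string originalName, string savedPath)
    {
        // persist the link between the current record and the uploaded file
    }
}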
And here is the client-side script that wires up the plugin:
var setupfileUpload = function (options) {
var $filePicker = $(options.elementSelector + ' .filePicker');
var $uploadedFiles = $(options.elementSelector + ' .uploadedFiles');
var $uploadErrors = $(options.elementSelector + ' .uploadErrors');
$filePicker.fileupload({
url: 'fileUpload.axd',
dropZone: $filePicker,
add: function (e, data) {
cleanExceptionsPanel();
var count = $uploadedFiles.find('.sentFile').length;
//client validation by file size and file type
if (!isUploadLimit(options.maxFiles, count) ||
!isFileValid(data.files[0].size, data.files[0].name)) return;
//to support duplicated files the div id should be unique for different files - and that Id should be passed to the handler.
var divId = 'id' + (new Date()).getTime();
//submit the form with 2 additional parameters - where to save and file id
data.formData = { targetFolder: options.targetFolder, fileId: divId, sessionId: options.sessionId, controlid: options.uniqueID };
var jqXHR = data.submit();
//append a div containing inputs for a given file
var div = $('<div id="' + divId + '" class="sentFile" data-filenumber="' + count + '"></div>');
div.append('<div class="progressbar"><div class="progress"></div></div>');
div.append('<span class="fileName">' + data.files[0].name + '</span>');
var cancelButton = $('<a href="#" class="deleteUploadedFile">x</a>');
cancelButton.on('click', function () { //delete uploaded file and/or cancel the upload process
cleanExceptionsPanel();
jqXHR.abort();
var savedFile = $(this).parent().find('.savedFile').val();
if (savedFile)
$.post('fileUpload.axd', { fileName: savedFile, deleteRequest: true, sessionId: options.sessionId, controlid: options.uniqueID });
$(this).parent().remove();
});
div.append(cancelButton);
div.append('');
$uploadedFiles.append(div);
},
done: function (e, data) {
var res = jQuery.parseJSON(data.result);
var div = $uploadedFiles.find('#' + res.FileId);
//remove the progress bar and insert a 'complete' dot instead
div.find('.progressbar').remove();
div.prepend('<span class="completeDot"></span>');
var count = div.data('filenumber');
//hidden inputs that the server-side Files property reads back on postback
div.append('<input type="hidden" class="savedFile" name="' + options.uniqueID + 'savedFile_' + count + '" value="' + res.SavedFile + '" />');
div.append('<input type="hidden" name="' + options.uniqueID + 'originalFile_' + count + '" value="' + res.OriginalFile + '" />');
//manually hide the validation error
var validationError = $(options.elementSelector + ' span.errors .errorMessage');
validationError.css('visibility', 'hidden');
validationError.css('display', 'none');
},
progress: function (e, data) {
var p = parseInt(data.loaded / data.total * 100, 10);
if (typeof p === 'number') {
var div = $uploadedFiles.find('#' + data.formData.fileId);
var progress = div.find('.progress');
progress.css('width', p + '%');
}
},
fail: function (e, data) {
var div = $uploadedFiles.find('#' + data.formData.fileId);
//remove the progress bar and insert a 'fail' dot instead
div.find('.progressbar').remove();
div.prepend('<span class="failDot"></span>');
}
});
var cleanExceptionsPanel = function() {
$uploadErrors.html('');
}
var isUploadLimit = function(maxCount, currentCount) {
if (maxCount <= 0) return true;
if (maxCount == 1) { // replace existing file
$uploadedFiles.find('.deleteUploadedFile').each(function() { $(this).click(); });
}
else if (maxCount <= currentCount) {
$uploadErrors.append('Maximum number of files is attached');
return false;
}
return true;
};
var isFileValid = function (filesize, filename) {
if (filesize > options.maxFileSize) {
$uploadErrors.append('File is too large to be uploaded');
return false;
}
// '*' means no restriction; otherwise the file name must end with one of the allowed extensions
if (options.extensions !== '*') {
var pattern = '.+?\\.(' + options.extensions + ')$';
if (!filename.match(new RegExp(pattern, 'i'))) {
$uploadErrors.append('File type is not supported and cannot be uploaded');
return false;
}
}
return true;
};
var escapeFileName = function (fileName) {
return fileName.replace(/ |\.|#/g, '_');
}
if (options.submitButtonsSelector) {
$(options.submitButtonsSelector)
.prop('disabled', false)
.each(function () {
$(this).attr('title', $(this).attr('data-oldtitle'));
});
}
};
And the HTTP handler that receives the uploads and the delete requests (it is mapped in web.config to the fileUpload.axd path used by the script above):
public class FileUploadHandler : IHttpHandler, IReadOnlySessionState
{
private static readonly Logger _log = LogManager.GetCurrentClassLogger();
private void DeleteFile(HttpContext context)
{
// Make sure to sanitise the filename by calling Path.GetFileName. This will
// prevent deletions from folders other than the target folder (which is known only
// by the server)
var filename = Path.GetFileName(context.Request.Form["filename"]);
var controlId = context.Request.Form["controlId"];
var path = context.Session[controlId + "_TargetFolder"].ToString();
var targetFilename = Path.Combine(path, filename);
if (File.Exists(targetFilename))
File.Delete(targetFilename);
}
private void UploadFile(HttpContext context)
{
Parse(context.Request.InputStream, Encoding.UTF8);
var controlId = context.Request.Form["controlId"];
var fileId = context.Request.Form["fileId"];
var path = context.Session[controlId + "_TargetFolder"].ToString();
var targetFolder = Directory.CreateDirectory(path).FullName;
var targetFilename = Path.Combine(targetFolder, _filename);
// Handle existing files by incrementing counter
int counter = 1;
while (File.Exists(targetFilename))
{
counter++;
targetFilename = Path.Combine(targetFolder,
Path.GetFileNameWithoutExtension(_filename) + counter + Path.GetExtension(_filename));
}
using (var file = File.Create(targetFilename))
{
file.Write(_fileContents, 0, _fileContents.Length);
}
context.Response.Write(new JavaScriptSerializer()
.Serialize(new
{
OriginalFile = _filename,
SavedFile = Path.GetFileName(targetFilename),
FileId = fileId
}));
}
public void ProcessRequest(HttpContext context)
{
var sessionId = context.Request.Form["sessionId"];
if (context.Session == null || context.Session.SessionID != sessionId)
throw new InvalidOperationException("Wrong session state during the file upload operation");
if (context.Request.Form["deleteRequest"] != null)
{
DeleteFile(context);
}
else
{
UploadFile(context);
}
}
private byte[] ToByteArray(Stream stream)
{
byte[] buffer = new byte[32768];
using (MemoryStream ms = new MemoryStream())
{
while (true)
{
int read = stream.Read(buffer, 0, buffer.Length);
if (read <= 0)
return ms.ToArray();
ms.Write(buffer, 0, read);
}
}
}
private int IndexOf(byte[] searchWithin, byte[] searchFor, int startIndex)
{
int index = 0;
int startPos = Array.IndexOf(searchWithin, searchFor[0], startIndex);
if (startPos != -1)
{
while ((startPos + index) < searchWithin.Length)
{
if (searchWithin[startPos + index] == searchFor[index])
{
index++;
if (index == searchFor.Length)
{
return startPos;
}
}
else
{
startPos = Array.IndexOf(searchWithin, searchFor[0], startPos + index);
if (startPos == -1)
{
return -1;
}
index = 0;
}
}
}
return -1;
}
private void Parse(Stream stream, Encoding encoding)
{
// Read the stream into a byte array
byte[] data = ToByteArray(stream);
// Copy to a string for header parsing
string content = encoding.GetString(data);
// The first line should contain the delimiter
int delimiterEndIndex = content.IndexOf("\r\n");
if (delimiterEndIndex > -1)
{
string delimiter = content.Substring(0, content.IndexOf("\r\n"));
// Look for Content-Type
Regex re = new Regex(@"(?<=Content\-Type:)(.*?)(?=\r\n\r\n)");
Match contentTypeMatch = re.Match(content);
// Look for filename
re = new Regex(@"(?<=filename\=\"")(.*?)(?=\"")");
Match filenameMatch = re.Match(content);
// Did we find the required values?
if (contentTypeMatch.Success && filenameMatch.Success)
{
// Set properties
this._contentType = contentTypeMatch.Value.Trim();
this._filename = filenameMatch.Value.Trim();
// Get the start & end indexes of the file contents
int startIndex = contentTypeMatch.Index + contentTypeMatch.Length + "\r\n\r\n".Length;
byte[] delimiterBytes = encoding.GetBytes("\r\n" + delimiter);
int endIndex = IndexOf(data, delimiterBytes, startIndex);
int contentLength = endIndex - startIndex;
// Extract the file contents from the byte array
byte[] fileData = new byte[contentLength];
Buffer.BlockCopy(data, startIndex, fileData, 0, contentLength);
this._fileContents = fileData;
}
}
}
private string _contentType;
private string _filename;
private byte[] _fileContents;
public bool IsReusable { get { return false; } }
}
Here I've used the Session to store the target folder for the files, so the physical path never leaves the server and cannot be tampered with on the client. Hope this will help someone :)
Yesterday it was pointed out to me that there is a security problem in my code. I was coding the async file uploader (I will post it a little later) and it can delete unused files as well. The delete is done by posting a file name that is stored in a hidden input. That in itself is fine, but the problem is in my usage of the Path.Combine method.
Here is a snippet:
var p1 = "C:\\Test";
var p2 = "C:\\NOT_A_TEST\\File.txt";
var p3 = "File.txt";
Console.WriteLine(Path.Combine(p1, p2)); //result is "C:\NOT_A_TEST\File.txt"
Console.WriteLine(Path.Combine(p1, p3)); //all good - C:\Test\File.txt
//and 2 safe methods:
Console.WriteLine(Path.Combine(p1, Path.GetFileName(p2)));
Console.WriteLine(Path.Combine(p1, Path.GetFileName(p3)));
So if you want to use Path.Combine, make sure that the last part is only a file name and not a full path, because a rooted second argument replaces the whole result. And as a side note, Path.GetFileName is an easy way to sanitise a user-supplied file name before combining it with a server-side folder.
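If you ever need to accept something longer than a bare file name, a stricter option is to combine first and then verify that the normalised result is still inside the intended folder. A minimal sketch (PathHelper and SafeCombine are just illustrative names):
public static class PathHelper
{
    public static string SafeCombine(string baseFolder, string userSuppliedPath)
    {
        // Path.GetFullPath normalises ".." segments and rooted paths, so any
        // traversal attempt resolves to a path outside the base folder
        var combined = Path.GetFullPath(Path.Combine(baseFolder, userSuppliedPath));
        var root = Path.GetFullPath(baseFolder).TrimEnd(Path.DirectorySeparatorChar) + Path.DirectorySeparatorChar;
        if (!combined.StartsWith(root, StringComparison.OrdinalIgnoreCase))
            throw new InvalidOperationException("Path escapes the target folder: " + userSuppliedPath);
        return combined;
    }
}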