Is there a simple way to back up the event log, for example with a batch file or a small app?
I need it to work at a customer's site, where it will be run by a non-expert user.
Thanks
If you're using Windows 2008, use the built-in wevtutil command. Example:
wevtutil epl Application c:\temp\foo.evtx
Otherwise, get dumpel.exe from the resource kit, or psloglist from http://technet.microsoft.com/en-us/sysinternals/bb897544.aspx
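If the goal is something a non-expert user can just double-click, one option is a tiny console app that shells out to wevtutil. A minimal sketch, assuming Windows Vista/2008 or later; the log name, output folder and file name are illustrative:

using System;
using System.Diagnostics;

class BackupEventLog
{
    static void Main()
    {
        // Illustrative target: a timestamped file under C:\temp.
        string target = string.Format(@"C:\temp\Application_{0:yyyyMMdd_HHmmss}.evtx", DateTime.Now);

        // wevtutil epl <log> <file> exports the log; it typically needs admin rights.
        var psi = new ProcessStartInfo("wevtutil", "epl Application \"" + target + "\"")
        {
            UseShellExecute = false
        };
        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
            Console.WriteLine("wevtutil exited with code {0}", p.ExitCode);
        }
    }
}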
With PowerShell and Export-Clixml it's a one-liner:
get-eventlog -list | %{ get-eventlog $_.Log | export-clixml -path ($_.Log + ".xml") }
The Microsoft Script Center has some sample code for Backing Up and Clearing Event Logs using VBScript and WMI.
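For reference, the same WMI approach can be used from a small C# app via System.Management. A minimal sketch, assuming the Application log and an illustrative target path (BackupEventLog needs the backup privilege, hence EnablePrivileges):

using System;
using System.Management; // add a reference to System.Management.dll

class WmiBackup
{
    static void Main()
    {
        // BackupEventLog requires the backup privilege to be enabled.
        var scope = new ManagementScope(@"\\.\root\cimv2",
            new ConnectionOptions { EnablePrivileges = true });
        scope.Connect();

        var query = new ObjectQuery(
            "SELECT * FROM Win32_NTEventLogFile WHERE LogfileName = 'Application'");
        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject log in searcher.Get())
            {
                // Returns 0 on success; the target file must not already exist.
                object result = log.InvokeMethod("BackupEventLog",
                    new object[] { @"C:\temp\Application.evt" });
                Console.WriteLine("BackupEventLog returned {0}", result);
            }
        }
    }
}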
Frank-Peter Schultze's Scripting Site has some code to clear an event log (http://www.fpschultze.de/uploads/clrevt.vbs.txt) that you can modify to back up, or to back up and then clear.
If you have access to the server you can back up from the Event Viewer by right-clicking on a log and using the "Save Log File As..." command. You can save to a binary, tab-delimited, or comma-delimited file.
Finally, I made a little Windows app using this method I found on the internet:
public void DoBackup(string sLogName)
{
    // Requires System, System.Diagnostics, System.IO, System.Linq and System.Xml.Linq.
    // sLogName is the log to export, for example "Application".
    EventLog log = new EventLog();
    log.Log = sLogName; // Log (not Source) selects which event log to read

    string sBackupName = sLogName + "Log";
    var xml = new XDocument(
        new XElement(sBackupName,
            from EventLogEntry entry in log.Entries
            orderby entry.TimeGenerated descending
            select new XElement("Log",
                new XElement("Message", entry.Message),
                new XElement("TimeGenerated", entry.TimeGenerated),
                new XElement("Source", entry.Source),
                new XElement("EntryType", entry.EntryType.ToString())
            )
        )
    );

    // Timestamped file name, e.g. ApplicationLog_20121231_235959.xml
    // ("HHmmss" rather than "hhmmss", so hours are unambiguous).
    string sToday = DateTime.Now.ToString("yyyyMMdd_HHmmss");
    string path = String.Format("{0}_{1}.xml", sBackupName, sToday);
    xml.Save(Path.Combine(Environment.CurrentDirectory, path));
}
This is the source link:
It simply works great!
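For reference, calling it looks like this (a hypothetical sketch; the name of the class holding DoBackup is assumed):

// Back up the standard logs to timestamped XML files in the current directory.
var backup = new EventLogBackup(); // hypothetical class containing DoBackup
backup.DoBackup("Application");
backup.DoBackup("System");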
I have the following C# code in a console application.
Whenever I debug the application and run query1 (which inserts a new value into the database) and then run query2 (which displays all the entries in the database), I can clearly see the new entry I inserted. However, when I close the application and check the table in the database (in Visual Studio), it is gone. I have no idea why it is not saving.
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Data.SqlServerCe;
using System.Data;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                string fileName = "FlowerShop.sdf";
                string fileLocation = "|DataDirectory|\\";
                DatabaseAccess dbAccess = new DatabaseAccess();
                dbAccess.Connect(fileName, fileLocation);
                Console.WriteLine("Connected to the following database:\n" + fileLocation + fileName + "\n");

                string query = "Insert into Products(Name, UnitPrice, UnitsInStock) values('NewItem', 500, 90)";
                string res = dbAccess.ExecuteQuery(query);
                Console.WriteLine(res);

                string query2 = "Select * from Products";
                string res2 = dbAccess.QueryData(query2);
                Console.WriteLine(res2);

                Console.ReadLine();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                Console.ReadLine();
            }
        }
    }

    class DatabaseAccess
    {
        private SqlCeConnection _connection;

        public void Connect(string fileName, string fileLocation)
        {
            Connect("Data Source=" + fileLocation + fileName);
        }

        public void Connect(string connectionString)
        {
            _connection = new SqlCeConnection(connectionString);
        }

        public string QueryData(string query)
        {
            _connection.Open();
            using (SqlCeDataAdapter da = new SqlCeDataAdapter(query, _connection))
            using (DataSet ds = new DataSet("Data Set"))
            {
                da.Fill(ds);
                _connection.Close();
                return ds.Tables[0].ToReadableString(); // an extension method I created
            }
        }

        public string ExecuteQuery(string query)
        {
            _connection.Open();
            using (SqlCeCommand c = new SqlCeCommand(query, _connection))
            {
                int r = c.ExecuteNonQuery();
                _connection.Close();
                return r.ToString();
            }
        }
    }
}
EDIT: Forgot to mention that I am using SQL Server Compact Edition 4 and VS2012 Express.
It is a quite common problem. You are using the |DataDirectory| substitution string. This means that, while debugging your app in the Visual Studio environment, the database used by your application is the copy located in the BIN\DEBUG subfolder (or the x86 variant) of your project. This works well: you don't get any error connecting to the database or performing update operations.
But then you exit the debug session and look at your database through the Visual Studio Server Explorer (or any other suitable tool). That window has a different connection string, probably pointing to the copy of your database in the project folder. You search your tables and you don't see the changes.
Then the problem gets worse. You restart VS to go hunting for the bug in your app, but your database file is listed among the project files with the property Copy to Output Directory set to Copy Always. At this point Visual Studio obliges and copies the original database file from the project folder to the output folder (BIN\DEBUG), and your previous changes are lost.
Now your application inserts/updates the target table again, you again can't find any error in your code, and the loop repeats until you decide to post or search on Stack Overflow.
You can stop this by selecting the database file in Solution Explorer and changing the property Copy to Output Directory to Copy If Newer or Never Copy. You could also update the connection string in Server Explorer to look at the working copy of your database, or create a second connection: the first one still points to the database in the project folder while the second one points to the database in the BIN\DEBUG folder. This way you keep the original database ready for deployment purposes and schema changes, while through the second connection you can look at the effective results of your coding efforts.
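To see exactly which file your app is hitting, you can inspect what |DataDirectory| resolves to at runtime. A minimal sketch (the explicit path in the final comment is illustrative):

using System;

class WhereIsMyDatabase
{
    static void Main()
    {
        // |DataDirectory| resolves to the "DataDirectory" AppDomain data slot;
        // for a console app it is usually unset, and the provider then falls
        // back to the application's base directory (BIN\DEBUG while debugging).
        object dataDir = AppDomain.CurrentDomain.GetData("DataDirectory");
        Console.WriteLine("DataDirectory = {0}",
            dataDir ?? AppDomain.CurrentDomain.BaseDirectory);

        // You can also point |DataDirectory| somewhere explicit so that debug
        // runs and Server Explorer look at the same file, e.g.:
        // AppDomain.CurrentDomain.SetData("DataDirectory", @"C:\Projects\FlowerShop");
    }
}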
EDIT: A special warning for MS Access database users. The simple act of looking at your table changes the modified date of your database, even if you don't write or change anything. So the Copy if Newer flag kicks in and the database file is copied to the output directory. With Access it is better to use Never Copy.
Committing changes / saving changes across debug sessions is a familiar topic in SQL CE forums. It is something that trips up quite a few people. I'll post links to source articles below, but I wanted to paste the answer that seems to get the best results to the most people:
You have several options to change this behavior. If your sdf file is part of the content of your project, this will affect how data is persisted. Remember that when you debug, all output of your project (including the sdf) is in the bin/debug folder.
You can decide not to include the sdf file as part of your project and manage the file location at runtime.
If you are using "Copy if newer", any project changes you make to the database will overwrite any runtime/debug changes.
If you are using "Do not copy", you will have to specify the location in code (two levels above where your program is running; see the sketch after this list).
If you have "Copy always", any changes made during runtime will always be overwritten.
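Here is a minimal sketch of the "Do not copy" case mentioned above: building the database location in code, two levels above the output folder (FlowerShop.sdf is the file from the question):

using System;
using System.IO;

class ProjectLevelDatabase
{
    static void Main()
    {
        // bin\Debug -> "..\.." is the project folder where the .sdf lives.
        string projectDir = Path.GetFullPath(
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\.."));
        string connectionString = "Data Source=" + Path.Combine(projectDir, "FlowerShop.sdf");
        Console.WriteLine(connectionString);
    }
}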
Answer Source
Here is a link to some further discussion and how-to documentation.
I'm working on integrating SQL Server databases into our in-house version control/deployment utility, which is built with PowerShell and uses GitHub as a repository.
Using the excellent sqlpackage.exe utility, I have been able to add a process whereby a developer can extract their current changes into a dacpac and store it in GitHub, then do the reverse when they want to get the latest version. However, because the .dacpac is a binary file, it's not possible to see differences in git. I have mitigated this somewhat by unzipping the dacpac before storing it in source control, so the contained XML files are added instead. However, even though these files are text-based, they are still not easy to look through and find differences in.
What I would like to do is convert the dacpac into a folder structure similar to what you would see in SSMS (with all the database objects such as triggers, sprocs, etc. in their respective folders), store that in GitHub, and then convert it back into a dacpac when a client checks out the code. However, there doesn't seem to be any function in sqlpackage.exe for this, and I can't find any documentation. Is there any command-line tool I can use to do this through PowerShell?
Using the public APIs for DacFx you can load the dacpac, iterate over all objects, and script each one out. If you're willing to write your own code you could write each one to its own file based on the object type. The basic process is covered in the model filtering samples in the DacExtensions GitHub project. Specifically, you'll want to do something like the ModelFilterer code that loads a dacpac, queries all objects, and scripts them out; see the CreateFilteredModel method. I've put a sample that should mostly work below. Once you have this, you can easily do a compare on a per-object basis.
using (TSqlModel model = new TSqlModel(dacpacPath))
{
    IEnumerable<TSqlObject> allObjects = model.GetObjects(DacQueryScopes.All);
    foreach (TSqlObject tsqlObject in allObjects)
    {
        string script;
        if (tsqlObject.TryGetScript(out script))
        {
            // Some objects such as the DatabaseOptions can't be scripted out.
            // Write to disk by object type
            string objectTypeName = tsqlObject.ObjectType.Name;
            // pseudo-code as I didn't bother writing.
            // basically just create the folder and write a file
            this.MkdirIfNotExists(objectTypeName);
            this.WriteToFile(objectTypeName, tsqlObject.Name + ".sql", script);
        }
    }
}
This can be converted into a PowerShell cmdlet fairly easily. The DacFx libraries are on NuGet at https://www.nuget.org/packages/Microsoft.SqlServer.DacFx.x64/ so you should be able to install them in PowerShell and then use the code without too much trouble.
Based on the other post I was able to get a script working. The caveat is that you'll have to experiment with the types until you get what you want... The way it is now, it tries to put the full http or https value in for some of the objects.
param($dacpacPath = 'c:\somepath\Debug', $dacpac = 'your.dacpac')
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll'
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.Extensions.dll'
cd $dacpacPath
$model = [Microsoft.SqlServer.Dac.Model.TSqlModel]::new((Get-Item ".\$dacpac").FullName)
$queryScopes = [Microsoft.SqlServer.Dac.Model.DacQueryScopes]::All
$returnObjects = $model.GetObjects($queryScopes)
$s = ''
foreach ($r in $returnObjects)
{
    if ($r.TryGetScript([ref]$s))
    {
        $objectTypeName = $r.ObjectType.Name
        $d = "c:\temp\db\$objectTypeName"
        if (!(Test-Path $d))
        {
            New-Item $d -ItemType Directory
        }
        $filename = "$d\$($r.Name.Parts).sql"
        if (!(Test-Path $filename))
        {
            New-Item $filename -ItemType File
        }
        $s | Out-File $filename -Force
        Write-Output $filename
    }
}
I have created the database in Windows Phone 7 and it works fine for me.
After rebuilding the application it says:
The database file cannot be found. Check the path to the database. [ Data Source = \Applications\Data\6157CB94-31D3-4E6F-BFC3-78BE1549C10A\Data\IsolatedStore\amit.sdf ]
My code for the connection string is
`private const string Con_String = @"isostore:/amit.sdf";`
How do I solve this? Please give me any suggestions.
Have you checked this sample: How to create a basic local database app for Windows Phone?
They use this path to create the db:
//Specify the connection string as a static, used in main page and app.xaml.
public static string DBConnectionString = "Data Source=isostore:/ToDo.sdf";
Also, don't forget to check whether the db exists:
//Create the database if it does not exist.
using (ToDoDataContext db = new ToDoDataContext(ToDoDataContext.DBConnectionString))
{
if (db.DatabaseExists() == false)
{
//Create the database
db.CreateDatabase();
}
}
I am in the process of building a website that in the near future will replace an existing website. The existing website has approx. 300 users which I need to "import" into the extranet module (which I have bought) in Composite.
Is there a way to batch-create users in the extranet module?
Yes, you can import your existing user database. You can either do this by writing a script and have that execute on your web site or by directly manipulating the underlying SQL table / XML file (depending on what you use to store Composite C1 data). You can also build a provider that links your existing user database with Composite C1 Extranet.
Importing users programmatically: For a script approach please see methods like AddNewUser described on http://docs.composite.net/Packages/Community/Extranet/CompositeCommunityExtranetDeveloperGuide/Using-Extranet-Facade-Methods
You would write this script as web service, aspx page or similar which executes on the Composite C1 website.
If you are running the Extranet in a default setup expect the providerName to be "Default".
Manipulating the physical data store directly: This depends on what data store you are running on. I suggest you add the groups you want and a test user to help you recognize data when you look at the underlying XML files / SQL tables.
If you are running on XML (the default) you should focus on the files named Composite.Community.Extranet.DefaultProvider.DataTypes.DefaultProvider*.xml located in the folder ~/App_Data/Composite/DataStores. There are three such files: one for groups, one for users, and one for the relation between users and groups.
If you are running on SQL Server you should focus on the three tables named Composite.Community.Extranet.DefaultProvider.DataTypes.DefaultProvider*.
In both cases you would need to add new entries to the User table/xml file and matching group relations to the GroupUser table/xml file. When you add a user you provide a unique ID and this ID you reuse to register the user in GroupUser.
When you have made your changes you can force Composite C1 to reload by using the Tools | Restart Server command in the C1 Console. If you make a backup of files/tables before you make changes you can easily revert by restoring the backup (in case you need to start over).
Writing a user/group provider: If your user data is in an external store and you would like to keep it there you could also make a bridge between this existing user store and the Composite C1 Extranet by creating a custom provider. If this is relevant see http://docs.composite.net/Packages/Community/Extranet/CompositeCommunityExtranetDeveloperGuide/Writing-Custom-Extranet-Providers
Thank you, it now works. I imported the users programmatically: I opened the Composite solution in Visual Studio and added an aspx page. Here is the code-behind.
using System;
using System.Collections.Generic;
using System.IO;
using Composite.Community.Extranet;
using Composite.Community.Extranet.Data;

public partial class ImportUsers : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // UsersToImport.csv holds one user per line: Name,UserName,Email
        string path = Server.MapPath("UsersToImport.csv");
        IList<Guid> userIds = new List<Guid>();
        using (StreamReader streamReader = new StreamReader(path))
        {
            string line;
            while ((line = streamReader.ReadLine()) != null)
            {
                string[] fields = line.Split(',');
                IExtranetUser extranetUser = new ExtranetUser();
                extranetUser.Name = fields[0];
                extranetUser.UserName = fields[1];
                extranetUser.Email = fields[2];
                extranetUser.IsApproved = true;

                IExtranetUser addedUser = ExtranetFacade.AddNewUser("Default", extranetUser);
                userIds.Add(addedUser.Id);

                // Assign the imported users (accumulated so far) to the target group.
                ExtranetFacade.SetUsersForGroup("Default", new Guid("bc728100-e28e-4135-a14c-bead6e0b9b00"), userIds);
                Response.Write(string.Format("User: {0} added at {1}", addedUser.UserName, addedUser.CreationDate));
            }
        }
    }
}
I've parsed these files in regular C# applications, but the IO methods for the files are different in Silverlight, and I can't seem to find the right methods. Searches haven't turned up any information I can use. For the real application I'll be receiving XML from the server, but for the prototype I just need to parse a file with some sample data in it.
You can save the Excel file as XML. An example can be found in this link.
This way you can keep your import procedure the same and process the data as when you go live.
To access files on the user's machine you are required to use the OpenFileDialog and SaveFileDialog. Without elevated trust (which requires an out-of-browser app) you will not be able to learn anything more than the name of the file the user selected for input/saving; you will have no idea what the path to this file is. The dialog can only be shown as the result of a user action such as clicking a button; otherwise it will fail, because Silverlight does not want malicious code prompting the user with annoying dialogs automatically.
To do this you would write something like the following:
var openFile = new OpenFileDialog();
if (openFile.ShowDialog() == true) // Sadly ShowDialog returns a nullable bool, so this comparison is necessary
{
    using (Stream myStream = openFile.File.OpenRead())
    using (var reader = new StreamReader(myStream))
    {
        ...
    }
}
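From there, parsing the sample data is ordinary LINQ to XML, which Silverlight supports via the System.Xml.Linq assembly. A minimal sketch for the body of the using block above (the element names are placeholders):

// Requires a reference to System.Xml.Linq.
XDocument doc = XDocument.Load(reader);
foreach (XElement record in doc.Root.Elements("Record")) // "Record" is a placeholder element name
{
    string value = (string)record.Element("Value");      // so is "Value"
    // ...populate your prototype's model from the parsed values
}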