Nagios: Monitor Windows certificates

 

This plugin's job is to monitor the expiration date of certificates in the Windows certificate stores.

I am using nsclient++ on servers to execute PowerShell scripts & co. The flow is:

Nagios => check_nrpe => PowerShell script => Nagios feedback

It also works on Server Core.

Windows ships with many certificates that are already expired. The script therefore contains a thumbprint blacklist to ignore them.
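
For reference, the certificates that are already expired today (and are therefore blacklist candidates) can be listed with the same Get-ChildItem call the script relies on; a minimal sketch:

# list already-expired certificates across all local stores, with the thumbprint used by the blacklist
Get-ChildItem -Path cert: -Recurse |
 Where-Object { $_.NotAfter -and ($_.NotAfter -lt (Get-Date)) } |
 Select-Object Thumbprint, NotAfter, FriendlyName, PSParentPath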

 

Tested setup

Linux:

  • Centos 6.4 x64
  • Nagios 3.4.4
  • check_nrpe 2.13
  • Centreon 2.4.2

Windows:

  • Windows Server 2003 / 2008 R2 / 2012
  • nsclient++ 0.4.1 x64 and x86
  • Core & GUI servers

Script arguments

  • checkMyStore (on by default)
  • checkRootStore (on by default)
  • checkCAStore (on by default)
  • checkAuthRootStore (on by default)
  • checkSharePointStore (on by default)
  • expireInDays (60 days by default)
  • maxWarn (warning if above)
  • maxCrit (Critical if above)

For each store, the argument must be a boolean ($true/$false or 1/0).

expireInDays, maxWarn and maxCrit must be integers.

Sample usages

Directly in PowerShell:

PS C:\Program Files\NSClient++\scripts> . .\lotp_check_certificates.ps1
CRITICAL: www.lotp.fr:2013/06/30
PS C:\Program Files\NSClient++\scripts>
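
The script can also be dot-sourced with named parameters; note that in the param block further down the critical threshold is named maxError. A minimal sketch with illustrative values:

PS C:\Program Files\NSClient++\scripts> . .\lotp_check_certificates.ps1 -checkMyStore $true -checkRootStore $false -checkCAStore $false -checkAuthRootStore $false -checkSharePointStore $false -expireInDays 30 -maxWarn 0 -maxError 2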

Through NRPE:

[root~]# /usr/lib64/nagios/plugins/check_nrpe -H myserver -n -c check_certificate -a $true $true $true $true $true 60 0 0

CRITICAL: www.lotp.fr:2013/06/30

[root~]#

Install:

On Windows:

  • Enable unsigned PowerShell script execution: Set-ExecutionPolicy RemoteSigned
  • Copy the script into C:\Program Files\NSClient++\scripts
  • Add to nsclient.ini:
    • [/settings/external scripts/wrapped scripts]
      check_certificate=lotp_check_certificates.ps1 $ARG1$ $ARG2$ $ARG3$ $ARG4$ $ARG5$ $ARG6$ $ARG7$ $ARG8$

Setup:

On Centreon, by adding this command:

$USER1$/check_nrpe -H $HOSTADDRESS$ -n -c check_certificate -a $ARG1$ $ARG2$ $ARG3$ $ARG4$ $ARG5$ $ARG6$ $ARG7$ $ARG8$

Download

(remove .txt at the end)

lotp_check_certificates.ps1

Source code here:

 

# ====================================================================
# Check certificates health state
# Author: Mathieu Chateau - LOTP
# mail: mathieu.chateau@lotp.fr
# version 0.1
# ====================================================================

#
# Require Set-ExecutionPolicy RemoteSigned.. or sign this script with your PKI 
#

# ============================================================
#
#  Do not change anything behind that line!
#
param 
(
	[bool]$checkMyStore=$true,
	[bool]$checkRootStore=$true,
	[bool]$checkCAStore=$true,
	[bool]$checkAuthRootStore=$true,
	[bool]$checkSharePointStore=$true,
	[int]$expireInDays=60,
	[int]$maxWarn = 1,
	[int]$maxError = 0

)

# blacklist all third party known expired certificates in root & co, on Windows Server 2003, 2008 & 2012
$blacklist=@(
"109F1CAED645BB78B3EA2B94C0697C740733031C",
"12519AE9CD777A560184F1FBD54215222E95E71F",
"127633A94F39CBF6EDF7C7BF64C4B535E9706E9A",
"18F7C1FCC3090203FD5BAA2F861A754976C8DD25",
"23EF3384E21F70F034C467D4CBA6EB61429F174E",
"245C97DF7514E7CF2DF8BE72AE957B9E04741E85",
"24A40A1F573643A67F0A4B0749F6A22BF28ABB6B",
"24BA6D6C8A5B5837A48DB5FAE919EA675C94D217",
"2B84BFBB34EE2EF949FE1CBE30AA026416EB2216",
"3A850044D8A195CD401A680C012CB0A3B5F8DC08",
"4463C531D7CCC1006794612BB656D3BF8257846F",
"47AFB915CDA26D82467B97FA42914468726138DD",
"4BA7B9DDD68788E12FF852E1A024204BF286A8F6",
"4D8547B7F864132A7F62D9B75B068521F10B68E3",
"4DF13947493CFF69CDE554881C5F114E97C3D03B",
"4EF2E6670AC9B5091FE06BE0E5483EAAD6BA32D9",
"4F65566336DB6598581D584A596C87934D5F2AB4",
"51C3247D60F356C7CA3BAF4C3F429DAC93EE7B74",
"53DECDF3BC1BDE7C9D1CEDAE718468CA20CC43E7",
"587B59FB52D8A683CBE1CA00E6393D7BB923BC92",
"5E997CA5945AAB75FFD14804A974BF2AE1DFE7E1",
"637162CC59A3A1E25956FA5FA8F60D2E1C52EAC6",
"6690C02B922CBD3FF0D0A5994DBD336592887E3F",
"67EB337B684CEB0EC2B0760AB488278CDD9597DD",
"687EC17E0602E3CD3F7DFBD7E28D57A0199A3F44",
"688B6EB807E8EDA5C7B17C4393D0795F0FAE155F",
"68ED18B309CD5291C0D3357C1D1141BF883866B1",
"720FC15DDC27D456D098FABF3CDD78D31EF5A8DA",
"7613BF0BA261006CAC3ED2DDBEF343425357F18B",
"7A74410FB0CD5C972A364B71BF031D88A6510E9E",
"7AC5FFF8DCBC5583176877073BF751735E9BD358",
"7B02312BACC59EC388FEAE12FD277F6A9FB4FAC1",
"7CA04FD8064C1CAA32A37AA94375038E8DF8DDC0",
"7D7F4414CCEF168ADF6BF40753B5BECD78375931",
"7F88CD7223F3C813818C994614A89C99FA3B5247",
"838E30F77FDD14AA385ED145009C0E2236494FAA",
"8977E8569D2A633AF01D0394851681CE122683A6",
"8B24CD8D8B58C6DA72ACE097C7B1E3CEA4DC3DC6",
"9078C5A28F9A4325C2A7C73813CDFE13C20F934E",
"90DEDE9E4C4E9F6FD88617579DD391BC65A68964",
"96974CD6B663A7184526B1D648AD815CF51E801A",
"9845A431D51959CAF225322B4A4FE9F223CE6D15",
"9BACF3B664EAC5A17BED08437C72E4ACDA12F7E7",
"9FC796E8F8524F863AE1496D381242105F1B78F5",
"A1505D9843C826DD67ED4EA5209804BDBB0DF502",
"A399F76F0CBF4C9DA55E4AC24E8960984B2905B6",
"A3E31E20B2E46A328520472D0CDE9523E7260C6D",
"A5EC73D48C34FCBEF1005AEB85843524BBFAB727",
"B19DD096DCD4E3E0FD676885505A672C438D4E9C",
"B533345D06F64516403C00DA03187D3BFEF59156",
"B6AF5BE5F878A00114C3D7FEF8C775C34CCD17B6",
"B72FFF92D2CE43DE0A8D4C548C503726A81E2B93",
"CFDEFE102FDA05BBE4C78D2E4423589005B2571D",
"D29F6C98BEFC6D986521543EE8BE56CEBC288CF3",
"DBAC3C7AA4254DA1AA5CAAD68468CB88EEDDEEA8",
"E38A2B7663B86796436D8DF5898D9FAA6835B238",
"EC0C3716EA9EDFADD35DFBD55608E60A05D3CBF3",
"EF2DACCBEABB682D32CE4ABD6CB90025236C07BC",
"F5A874F3987EB0A9961A564B669A9050F770308A",
"F88015D3F98479E1DA553D24FD42BA3F43886AEF")

$output=""
$outputNames=""
$countMyStore=0
$countRootStore=0
$countCAStore=0
$countAuthRootStore=0
$countSharePointStore=0
$countTotal=0

# collect every certificate expiring within $expireInDays days, skipping the Disallowed store and blacklisted thumbprints
$allCerts=Get-ChildItem -Path cert: -Recurse | ? {
($_.NotAfter -lt (get-date).AddDays($expireInDays)) -and 
($_.PSParentPath -notmatch "Disallowed") -and
($blacklist -notcontains $_.Thumbprint)} | select NotAfter,FriendlyName,PSParentPath

function outputCert ($temp)
{
	$outputTemp=""
	foreach ($t in $temp)
	{
		$outputTemp+=$t.FriendlyName+":"+(get-date -Date $t.NotAfter -format "yyyy/MM/dd")+" "
	}
	return $outputTemp
}
# check params if provided

if($checkMyStore)
{
	$temp=@($allCerts | ? {$_.PSParentPath -match "\\My$"})
	$countMyStore=$temp.Count
	if($temp.Count -gt 0)
	{
		$outputNames+=outputCert $temp
	}
}
if($checkRootStore)
{
	$temp=@($allCerts | ? {$_.PSParentPath -match "\\Root$"})
	$countRootStore=$temp.Count
	if($temp.Count -gt 0)
	{
		$outputNames+=outputCert $temp
	}
}
if($checkCAStore)
{
	$temp=@($allCerts | ? {$_.PSParentPath -match "\\CA$"})
	$countCAStore=$temp.Count
	if($temp.Count -gt 0)
	{
		$outputNames+=outputCert $temp
	}
}
if($checkAuthRootStore)
{
	$temp=@($allCerts | ? {$_.PSParentPath -match "\\AuthRoot$"})
	$countAuthRootStore=$temp.Count
	if($temp.Count -gt 0)
	{
		$outputNames+=outputCert $temp
	}
}
if($checkSharePointStore)
{
	$temp=@($allCerts | ? {$_.PSParentPath -match "\\SharePoint$"})
	$countSharePointStore=$temp.Count
	if($temp.Count -gt 0)
	{
		$outputNames+=outputCert $temp
	}
}

foreach ($var in (Get-Variable -Name "count*Store"))
{
	$countTotal+=$($var).Value
}

if($countTotal -gt $maxError)
{
	$state="CRITICAL"
	$exitcode=2
}
elseif($countTotal -gt $maxWarn)
{
	$state="WARNING"
	$exitcode=1
}
else
{
	$state="OK"
	$exitcode=0
}
$output=$state+": "+$outputNames

Write-Host $output
exit $exitcode

Nagios: monitor Active Directory accounts

Check for Active Directory Accounts using powershell through NRPE / nsclient++:

  • Account Disabled
  • Account Expired
  • Account Expiring
  • Account Inactive
  • Locked Out
  • Password Expired
  • Password Never Expires

I am using nsclient++ on servers to execute PowerShell scripts & co. The flow is:

Nagios => check_nrpe => PowerShell script => Nagios feedback

I am using the standard ActiveDirectory PowerShell module. This also works on Server Core.
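
The script below checks that the module is present and exits CRITICAL otherwise. If it is missing on a 2008 R2 / 2012 server, it can be added through RSAT; a sketch, assuming the standard RSAT-AD-PowerShell feature name:

# check whether the module is available, then add the RSAT feature if needed
Get-Module -Name ActiveDirectory -ListAvailable
Import-Module ServerManager
Add-WindowsFeature RSAT-AD-PowerShell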

Tested setup

Linux:

  • Centos 6.4 x64
  • Nagios 3.4.4
  • check_nrpe 2.13
  • Centreon 2.4.2

Active Directory:

  • Windows Server 2008 R2 / Windows Server 2012
  • nsclient++ 0.4.1 x64
  • Core & GUI Servers

Script args

  • action (LockedOut by default)
  • searchBase (Whole domain by default)
  • searchScope (Subtree by default)
  • maxWarn (warning if above)
  • maxCrit (Critical if above)

action can be:
AccountDisabled,AccountExpired,AccountExpiring,AccountInactive,LockedOut,PasswordExpired,PasswordNeverExpires
LockedOut if omitted

searchBase can be:
dc=mydomain,dc=com / ou=my users,dc=mydomain,dc=com
whole domain if omitted

searchScope can be:
Base, OneLevel, Subtree
Subtree if omitted

maxWarn and maxCrit must be integers.
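
Under the hood, the script turns the first three arguments into a Search-ADAccount call; a minimal hand-run equivalent (the domain DN is a placeholder):

# equivalent of action=AccountInactive over the whole domain subtree
Import-Module ActiveDirectory
$accounts = Search-ADAccount -AccountInactive -SearchBase "dc=mydomain,dc=com" -SearchScope Subtree
$accounts.Count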

Usage samples

Directly from PowerShell:

PS C:\Program Files\NSClient++\scripts> . .\lotp_check_ad_accounts.ps1 AccountInactive "dc=mydomain,dc=com" subtree 5 10
CRITICAL: 216 AccountInactive|216;5;10
PS C:\Program Files\NSClient++\scripts>

Through NRPE:

[root~]# /usr/lib64/nagios/plugins/check_nrpe -H prd-dom-dc01 -n -c check_ad_account -a AccountInactive "dc=pmside,dc=net" subtree 5 10

CRITICAL: 216 AccountInactive|'AccountInactive'=216;5;10

[root~]#

Install:

On DC:

  • Enable unsigned PowerShell script execution: Set-ExecutionPolicy RemoteSigned
  • Copy the script into C:\Program Files\NSClient++\scripts
  • Add to nsclient.ini:
    • [/settings/external scripts/wrapped scripts]
      check_ad_account=lotp_check_ad_accounts.ps1 $ARG1$ $ARG2$ $ARG3$ $ARG4$ $ARG5$

Setup:

For example on Centreon, adding this command:

$USER1$/check_nrpe -H $HOSTADDRESS$ -n -c check_ad_account -a $ARG1$ "$ARG2$" $ARG3$ $ARG4$ $ARG5$

Download

(remove .txt at the end)

lotp_check_ad_accounts.ps1

The source is below, in case the download fails:

# ====================================================================
# Search in AD for lockedout account. To be used through NRPE / nsclient++
# Author: Mathieu Chateau - LOTP
# mail: mathieu.chateau@lotp.fr
# version 0.1
# ====================================================================#
# Require Set-ExecutionPolicy RemoteSigned.. or sign this script with your PKI 
#
# ============================================================
#
#  Do not change anything behind that line!
#
param 
(
    [string]$action="LockedOut",
    [string]$searchBase="",
    [string]$searchScope="Subtree",
    [int]$maxWarn=5,
    [int]$maxCrit=10
)

# check that powershell ActiveDirectory module is present

if(Get-Module -Name "ActiveDirectory" -ListAvailable)
{
    try
    {
        Import-Module -Name ActiveDirectory
    }
    catch
    {
        Write-Host "CRITICAL: Missing PowerShell ActiveDirectory module"
        exit 2
    }
}
else
{
    Write-Host "CRITICAL: Missing PowerShell ActiveDirectory module"
    exit 2
}

# check params if provided

if($action -notmatch "^(AccountDisabled|AccountExpired|AccountExpiring|AccountInactive|LockedOut|PasswordExpired|PasswordNeverExpires)$")
{
    Write-Host "CRITICAL: action parameter can only be AccountDisabled,AccountExpired,AccountExpiring,AccountInactive,LockedOut,PasswordExpired,PasswordNeverExpires. Provided $action"
    exit 2
}
if($searchScope -notmatch "^(Base|OneLevel|Subtree)$")
{
    Write-Host "CRITICAL: searchScope parameter can only be Base,OneLevel,Subtree. Provided $searchScope"
    exit 2
}
if(($searchBase -ne "") -and $searchBase -ne ((Get-ADDomain).DistinguishedName))
{
    $search=Get-ADObject -Filter 'ObjectClass -eq "OrganizationalUnit" -and DistinguishedName -eq $searchBase'

    if ($search.Count -ne 1)
    {
        Write-Host "CRITICAL: SearchBase not found or duplicate. Provided $searchBase"
        exit 2
    }
}
else
{
    $searchBase=(Get-ADDomain).DistinguishedName
}

# build the Search-ADAccount call dynamically so the requested action becomes the switch parameter
$command="Search-ADAccount -"+$action+" -SearchBase '"+$searchBase+"' -SearchScope "+$searchScope

$result=invoke-expression $command

if($result.Count -gt $maxCrit)
{
    $state="CRITICAL"
    $exitcode=2
}
elseif($result.Count -gt $maxWarn)
{
    $state="WARNING"
    $exitcode=1
}
else
{
    $state="OK"
    $exitcode=0
}

$output=$state+": "+$result.Count+" "+$action+"|"+$action+"="+$result.Count+";"+$maxWarn+";"+$maxCrit

Write-Host $output
exit $exitcode

IT++: Manage better or buy stronger?

 

IT, like many other fields, ends up feeling cramped within the existing system, perhaps a little faster than elsewhere. Sooner or later a choice arises between better managing what exists and investing in a new solution or an upgrade.

 

Better manage what exists

This choice is more courageous than the second, but also riskier. It means you believe you can do better than what has been done so far. It generally means spending time for a gain that is hard to estimate in advance. I think it should be tried first because:

  • It can generate savings, even if they prove insufficient and an investment is still needed.
  • Reviewing actual usage gives a better understanding of the need, and so helps justify a potential investment.
  • It shows that you do not only “invest”.

The main thing is to set a goal in terms of time and effort to produce a result. The effort spent should, however, never exceed a set percentage of what the investment itself would cost.

 

Demand for computing resources rises inexorably in business. The most strategic projects generally get a capacity-planning exercise to ensure the solution will last the famous 3 or 4 years of depreciation. A few poor relations rarely benefit from this treatment:

  • Storage of office files,
  • Mail storage,
  • Network usage (inter-site, and Internet).


Asking people to clean up their office files is like rowing in the desert. Everyone claims to have better things to do, but nobody wants to pay the price it costs (central €MC / N€tApp storage, backup …). To stop the bleeding, miracle solutions have emerged (three-tier archiving, deduplication, SharePoint …). The last one brings indexing, which is almost worse: how do you find anything in a shambles if nobody ever tidies the room? Not only do users not want to delete old files, they do not want to classify them either…

Fortunately, we can turn the virus into a vaccine: search for the words salaries and bonuses. Guaranteed results!

The network is one of those heavy investments that move in stages; storage and backup belong there too. Solutions have existed for quite some time, because it was the first point of contention:

  • QoS: manage the pipe: guarantee bandwidth to some flows, restrict others
  • Compression: Riverbed & co. Hope that the data is redundant and do the equivalent of a “zip” on network streams.

In response to this, I propose two approaches in parallel:

  • Ensure that the “minimum” best practices are applied
  • Equip IT to be able to do chargeback.

 

Some good proven practices:

  • HTTP / HTTPS flows are compressed by web servers and proxies,
  • Replications (DFS, SQL …) between sites run during off-peak hours or with built-in bandwidth management,
  • Favor sending deltas rather than full copies,
  • Hunt for the largest files (see the sketch after this list),
  • Block from the start rather than after the fact (media files …),
  • Set quotas to manage unplanned growth, even if hard blocking is not really possible,
  • Note any “temporary” solution, identifying the requester, the reason and the removal date,
  • Put safety thresholds (warning / blocking) below the values that actually block.
  • After putting into production, revalidate the initial capacity planning.
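
For the “largest files” item above, a minimal PowerShell sketch (the share path is only an example, and the container filter keeps it compatible with PowerShell 2.0):

# report the 20 largest files under a share, in MB
Get-ChildItem -Path "D:\Shares" -Recurse -ErrorAction SilentlyContinue |
 Where-Object { -not $_.PSIsContainer } |
 Sort-Object Length -Descending |
 Select-Object -First 20 FullName, @{Name="SizeMB"; Expression={[math]::Round($_.Length / 1MB, 1)}}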

When the need comes from a specific project, it is often easy to identify who bears the cost. It is harder when it comes to the Internet or storage. Chargeback tools measure consumption; even if no actual chargeback is done, they clearly identify the consumers and make it possible to split the cost of the next upgrade.

 

Invest

This option is more or less an immediate answer to a problem or a need. On some topics, such as office files, it avoids attracting the wrath of users, especially when they do not hesitate to compare prices with a 1 TB disk from the shop around the corner. There are cases, however, where this choice does not bring the expected benefits. This is particularly true of performance issues, where buying a second server does not necessarily mean going twice as fast.

 

Investment is often favored because it also brings the resources to carry out the work. If you want to optimize your virtual infrastructure, you may struggle to get a budget for anything more than an audit; whereas if you run a project with new servers and an upgrade, you will be given the budget and the resources that go with it. This comes from the difficulty of showing the gains before the optimization has been done.

Conclusion

I recommend the following actions for the IT++ “label”:

  • Have key saturation indicators. They should give enough warning to leave time for an optimization phase; otherwise you end up with your back to the wall and investment becomes the systematic answer.
  • Ask projects to quantify the resources they will consume. Use upgrades to request the same exercise for existing projects. Check afterwards the difference between the expected and the actual. The figures are as interesting as the awareness people gain of their project's impact.
  • When a solution to a consumption problem is identified in one project (e.g. enabling HTTP compression), make it the default for all new projects, and ask existing ones to check whether it applies to them too.
  • Implement chargeback tools on shared items and where the consumer is not clearly identifiable.
  • Verify that consumption graphs are actually available for the key elements of the architecture: storage, network, processor, memory. The moment of saturation is not the time to start building these graphs.
  • Reduce the price per GB of central storage; this makes it easier to get it accepted. Ditto for the network.
  • Challenge past architecture choices at renewal time. Those choices were made based on:
    • the context,
    • the state of technology (maturity, cost, knowledge),
    • the budget.

Sometimes those choices were even made by others on the current architecture. It is less demanding to “just renew”, but it indirectly locks you into limited choices for the future.

Project 2013: Upgrade-SPProjectWebInstance – ActivatePWAWebThemesFeature failed

Trying to upgrade a Project instance from 2010 to 2013:

Upgrade-SPProjectWebInstance https://url/pwa

I got the following error:

Upgrade-SPProjectWebInstance : Post provision setup failed.
ActivatePWAWebThemesFeature failed.
At line:1 char:1
+ Upgrade-SPProjectWebInstance https://url/pwa
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 + CategoryInfo : InvalidData: (Microsoft.Offic...radePwaInstance:
 PSCmdletUpgradePwaInstance) [Upgrade-SPProjectWebInstance], ProvisionException
 + FullyQualifiedErrorId : Microsoft.Office.Project.Server.Cmdlet.PSCmdletUpgradePwaInstance

 

Solution:

You first need to upgrade the SharePoint site to 2013 so that the feature becomes available:

Upgrade-SPSite -Identity https://url/pwa -versionupgrade

Then run the PWA upgrade again:

Upgrade-SPProjectWebInstance https://url/pwa

Time management in IT

In IT, as in every professional field, we often feel short of time. Week after week, we can picture ourselves rowing in the desert.

That is why, in February of last year, I opened an account on RescueTime. Over the 10 months of 2012, it puts things in perspective:

 

That is 2,418 hours in total, or 6.6 hours per day, 7 days a week…. Another thing: the holidays are easy to spot (the holes).

Their software tracks which applications are in use and relies on its own database to classify them. I even installed it on my smartphone to get the big picture.

Their classification is good, with few errors, and these can be corrected directly.

It gets more interesting when digging by type of activity; for example, I have been doing more development lately:

 

By software:

 

We can even compare ourselves to other users of their software:

 

 

This solution does not save me time or significantly increase my productivity… but at least I can measure it and know where it went, which is a first step 🙂

 

 

System Center Orchestrator 2012 – PowerShell script – ForegroundColor definition exception

While migrating a PowerShell script to Orchestrator 2012, I got the following error message (French locale):

Exception lors de la définition de « ForegroundColor » : « Impossible de convertir la valeur Null en type « System.ConsoleColor » en raison de valeurs d'énumération non valides. Spécifiez l'une des valeurs d'énumération suivantes et réessayez. Les valeurs d'énumération possibles sont « Black, DarkBlue, DarkGreen, DarkCyan, DarkRed, DarkMagenta, DarkYellow, Gray, DarkGray, Blue, Green, Cyan, Red, Magenta, Yellow, White ». »

The script started with a classic “cls”. This command is not supported when a PowerShell script runs under Orchestrator.

In any case, you cannot see the screen output there, so it serves no purpose anyway 🙂

A cleaner error message would still have been appreciated, though.
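
If a script has to run both interactively and under Orchestrator, one option is to only clear the screen when a real console host is present; a minimal sketch, assuming the default console host name:

# skip Clear-Host when there is no interactive console (e.g. under Orchestrator)
if ($Host.Name -eq "ConsoleHost")
{
    Clear-Host
}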

TechDays – jour 1

If you are at TechDays, feel free to come by the System Center / Windows Server booth 🙂

I was there yesterday and I am there again today (Wednesday) all day.

It is right across from the ENI booth 😉

Record to beat: 154 badge scans yesterday!

Audi goodies: working around autorun

As an Audi owner, I received a mailing card from the brand inviting me to look at their assistance offer. A USB key is embedded in the card for that purpose:

Audi marketing

Once inserted into a Windows 7 computer, I saw a Plug & Play driver installation. Nothing unusual so far. After waiting a few minutes, still no autorun prompt, and no drive showing up in My Computer.

A quick look in Disk Management: no trace of a USB drive.

Classic move: I unplugged and replugged the key, “just to see”. And there, surprise! A Start/Run window appeared and a URL was typed in as if from a keyboard, without asking anything:

 

 

So it was actually a pre-programmed keyboard, not a USB storage key.

Clever, because it avoids having to accept autorun, and it probably bypasses corporate USB storage blocking, since it is a keyboard.

However, it grabs the focus directly and types blindly. If Start/Run is not allowed, it will simply type its text into the void.

SCOM 2012 : De l’installation à l’exploitation, mise en oeuvre et bonnes pratiques

A little self-promotion for a book I wrote, out just in time for Christmas 🙂

It covers System Center Operations Manager (SCOM) 2012, from installation through to handling the alerts.
It covers both the technical aspects (rules vs monitors) and how to manage and operate the solution.

It addresses the reader's need for expertise by covering in depth, from both a theoretical and a practical point of view, everything from architecture to application monitoring, including deployment, reports, management pack creation and day-to-day operation. Since the technical side is only one part of a SCOM project, the author also presents a method for managing the alert flow, enriched with his own field experience.

Overview of the table of contents:

Introduction (Why SCOM, How it works, The licensing model, SCOM vs Nagios, What is new since SCOM 2007…)

SCOM architecture (Introduction, SCOM components, Capacity planning, Network consumption, Security management, High availability…)

Deployment (Introduction, Initial installation, Upgrading from SCOM 2007, Accessing the platform, Initial configuration, Rights management…)

Bringing systems under monitoring (Introduction, Windows agent deployment, Restricted account, Linux agent deployment, SNMP monitoring, Web monitoring…)

Operating the monitoring (Introduction, Console overview, Rule vs monitor, Health Explorer, Christmas-tree effect, Maintenance mode, Best practices…)

Reports (Introduction, Using the standard reports, The data warehouse, Report Builder, Business Intelligence Development Studio, Service level tracking…)

Management packs (Introduction, Importing packs, Overrides, Community packs, Pack objects, Creating packs…)

Application monitoring (Introduction, Distributed applications, .NET web applications, Java J2EE applications…).

 http://www.editions-eni.fr/livres/scom-2012-system-center-operations-manager-de-l-installation-a-l-exploitation-mise-en-oeuvre-et-bonnes-pratiques/.2f80e3011dd5984d7046b876f446d463.html

If you have not sent your letter to Santa yet, there is still time! 😉