Output data from database in JSF page

I am building a project with JSF. I know how to read data submitted from my view, I know how to fetch data with the JDBC connector, and I know how to display data in the view from objects I instantiate myself. My question is:
How do I display data that comes directly from my database, for example a list of persons, in JSF, for example with the tag <h:outputText value="#{}"/>?
I have found some examples that work with manually instantiated objects, but I have not found a real example that uses data from a DB.

JSF is just an MVC framework for developing web applications in Java. JSF is not tied to any data source at all. The only data JSF uses is retrieved from:
The data already stored as attributes of the proper object: HttpServletRequest, HttpSession or ServletContext.
The request/view/session/application context in the form of fields in managed beans, i.e. classes annotated with @ManagedBean (or @Named if using CDI). The data in these fields is stored as attributes of the objects mentioned above, depending on the scope of the managed bean.
Knowing this, the only thing you have to worry about is filling the fields of your managed beans. You can fill them with data coming from a database, from a web service, or from whatever data source you have in mind.
For example, if you want/need to populate the data before processing a request, you can do the following:
@ManagedBean
@ViewScoped
public class SomeBean {

    private List<Entity> entityList;

    @PostConstruct
    public void init() {
        SomeService someService = new SomeService();
        entityList = someService.findEntityList();
    }

    //getters and setters for the list...
}
//as you can see, this class is just pure Java
//you may use other frameworks if you want/need
public class SomeService {

    public List<Entity> findEntityList() {
        String sql = "SELECT field1, field2... FROM table";
        List<Entity> entityList = new ArrayList<>();
        try (Connection con = ...; //retrieve your connection somehow
             PreparedStatement pstmt = con.prepareStatement(sql)) {
            ResultSet rs = pstmt.executeQuery();
            while (rs.next()) {
                Entity entity = new Entity();
                entity.setField1(rs.getString("field1"));
                entity.setField2(rs.getString("field2"));
                //...
                entityList.add(entity);
            }
        } catch (Exception e) {
            //handle exception ...
            e.printStackTrace();
        }
        return entityList;
    }
}

Related

Spring Batch "Invalid object name BATCH_JOB_INSTANCE"

I've created a Spring Batch job to query an Azure SQL Server database and write the data into a CSV file. I do not have create permissions on the database. I get the error "Invalid object name BATCH_JOB_INSTANCE" when running the batch. I don't want the Spring Batch meta-data tables to be created in the main database; it would also be fine to have them in another local or in-memory database like H2.
I've already added spring-batch-initialize-schema=never, which was the suggestion in most answers to similar questions on here, but that didn't help.
Edit:
I resolved the "Invalid object name" error by preventing the metadata tables from being created in the main database: I extended the DefaultBatchConfigurer class and overrode the setDataSource method, so the metadata is kept in the in-memory map-based repository. Now I want to try two options:
How to have the meta-data tables created in a local or in-memory database like H2.
Or, if the meta-data tables are already created in the main database, in a different schema than the main table I'm fetching from: how to point my job to those meta-data tables in the other schema, so it stores the job and step details there.
@Configuration
public class SpringBatchConfig extends DefaultBatchConfigurer {

    @Override
    public void setDataSource(DataSource datasource) {
    }
    ...
My application.properties file looks like this:
spring.datasource.url=
spring.datasource.username=
spring.datasource.password=
spring.datasource.driver-class-name=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring-batch-initialize-schema=never
spring.batch.job.enabled=false
spring.jpa.hibernate.ddl-auto=update
spring.jpa.show-sql=true
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
I've created a demo with two datasources: the batch metadata is stored in an H2 database and the job datasource is Azure SQL.
We need to define a DataSourceConfig class and use the @Primary annotation on the DataSource bean:
@Configuration
public class DataSourceConfig {

    @Bean(name = "mssqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource appDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "h2DataSource")
    @Primary
    // @ConfigurationProperties(prefix = "spring.datasource.h2")
    public DataSource h2DataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:h2:mem:thing:H2;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("")
                .build();
    }
}
In the ItemReaderDbDemo class, we use @Autowired @Qualifier("mssqlDataSource") to specify the dataSource in the Spring Batch task:
@Configuration
public class ItemReaderDbDemo {

    //generate task Object
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    //Step exec tasks
    //generate step Object
    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    @Qualifier("mssqlDataSource")
    private DataSource dataSource;

    @Autowired
    @Qualifier("dbJdbcWriter")
    private ItemWriter<? super Todo> dbJdbcWriter;

    @Bean
    public Job itemReaderDbDemoJob() {
        return jobBuilderFactory.get("itemReaderDbDemoJob").start(itemReaderDbStep()).build();
    }

    @Bean
    public Step itemReaderDbStep() {
        return stepBuilderFactory.get("itemReaderDbStep")
                .<Todo, Todo>chunk(2)
                .reader(dbJdbcReader())
                .writer(dbJdbcWriter)
                .build();
    }

    @Bean
    @StepScope
    public JdbcPagingItemReader<Todo> dbJdbcReader() {
        JdbcPagingItemReader<Todo> reader = new JdbcPagingItemReader<Todo>();
        reader.setDataSource(dataSource);
        reader.setFetchSize(2);
        reader.setRowMapper(new RowMapper<Todo>() {
            @Override
            public Todo mapRow(ResultSet rs, int rowNum) throws SQLException {
                Todo todo = new Todo();
                todo.setId(rs.getLong(1));
                todo.setDescription(rs.getString(2));
                todo.setDetails(rs.getString(3));
                return todo;
            }
        });
        SqlServerPagingQueryProvider provider = new SqlServerPagingQueryProvider();
        provider.setSelectClause("id,description,details");
        provider.setFromClause("from dbo.todo");
        //sort
        Map<String, Order> sort = new HashMap<>(1);
        sort.put("id", Order.DESCENDING);
        provider.setSortKeys(sort);
        reader.setQueryProvider(provider);
        return reader;
    }
}
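The demo autowires an ItemWriter qualified as "dbJdbcWriter" but does not show its definition. A minimal sketch of such a writer bean, assuming a hypothetical dbo.todo_copy target table (the table, its columns and this bean shape are my assumptions, not part of the original demo), could be added to the same configuration class:
@Bean(name = "dbJdbcWriter")
public ItemWriter<Todo> dbJdbcWriter(@Qualifier("mssqlDataSource") DataSource dataSource) {
    // JdbcBatchItemWriter (org.springframework.batch.item.database) writes each chunk
    // with one parameterized INSERT; named parameters are resolved from Todo's getters.
    JdbcBatchItemWriter<Todo> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(dataSource);
    writer.setSql("INSERT INTO dbo.todo_copy (id, description, details) VALUES (:id, :description, :details)");
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
    // no afterPropertiesSet() call needed: Spring invokes it because the writer is an InitializingBean
    return writer;
}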
Here is my application.properties:
logging.level.org.springframework.jdbc.core=DEBUG
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.jdbcUrl=jdbc:sqlserver://josephserver2.database.windows.net:1433;database=<Your-Database-Name>;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
spring.datasource.username=<Your-UserName>
spring.datasource.password=<Your-Password>
spring.datasource.initialization-mode=always
It returns the expected result from my Azure SQL database. By the way, my Azure SQL username does not have create permissions for the database.
How to have the meta-data tables created in a local or in-memory database like H2.
You can use spring.batch.initialize-schema=embedded for that.
Or, if the meta-data tables are already created in the main database, in a different schema than the main table I'm fetching from: how to point my job to those meta-data tables in the other schema, so it stores the job and step details there.
Spring Batch works against a datasource, not a particular schema. If the meta-data tables are in a different schema, then you need to create a second datasource pointing to that schema and set it on the job repository.
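For that second option, here is a minimal sketch: it hands a second datasource (a hypothetical bean named batchMetadataDataSource, pointing at the schema that holds the BATCH_* tables) to the job repository via DefaultBatchConfigurer, while the step's reader and writer keep using the main datasource:
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchMetadataConfig extends DefaultBatchConfigurer {

    // The JobRepository and JobExplorer built by DefaultBatchConfigurer will use this
    // datasource for the BATCH_* tables; readers and writers use whatever you inject into them.
    public BatchMetadataConfig(@Qualifier("batchMetadataDataSource") DataSource metadataDataSource) {
        super(metadataDataSource);
    }
}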
I know this post is a little bit old, but I'd like to give an update.
For newer versions of Spring Boot, spring.batch.initialize-schema is deprecated.
I'm using Spring Boot 2.7.1, and the newer property is spring.batch.jdbc.initialize-schema.
In my case, the error message came from the user not having the CREATE TABLE permission needed to create the corresponding Spring Batch tables.
Adding the permission fixed the issue.

Initialize Spring embedded database after deployment

I have a Spring MVC app with an embedded database (HSQLDB) that I want to initialize after deployment. I know that I could use an XML script to define initial data for my datasource, but since I'm using JPA + Hibernate, I would like to use Java code. Is there a way to do this?
Heavily updated answer (it was too complex before):
All you need is to add an initializing bean to your context, which will insert all the necessary data into the database:
public class MockDataPopulator {

    private static boolean populated = false;

    @Autowired
    private SessionFactory sessionFactory;

    @PostConstruct
    public void populateDatabase() {
        // Prevent duplicate initialization, as HSQL is also initialized only once. Duplicate executions
        // can happen when the application context is reloaded - e.g. when running unit tests.
        if (populated) {
            return;
        }
        // Create new persistence session
        Session session = sessionFactory.openSession();
        session.setFlushMode(FlushMode.ALWAYS);
        // Insert mock entities
        session.merge(MockDataFactory.createMyFirstObject());
        session.merge(MockDataFactory.createMySecondObject());
        // ...
        // Flush and close
        session.flush();
        session.close();
        // Set initialization flag
        populated = true;
    }
}
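The populator still needs to be registered as a bean; any mechanism works (XML, component scanning, or Java config). A sketch with Java config, assuming the class above is on the classpath:
@Configuration
public class MockDataConfig {

    // Registering the populator as a bean is what triggers its @PostConstruct method
    // once the context (and the autowired SessionFactory) is ready.
    @Bean
    public MockDataPopulator mockDataPopulator() {
        return new MockDataPopulator();
    }
}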

One To One Relationship in JPA (AppEngine)

In my Profile class I have
@OneToOne(cascade=CascadeType.PERSIST)
private ProfilePicture profilePic = null;
My method for updating the profilePic:
public Profile updateUserProfilePic(Profile user) {
    EntityManager em = EMF.get().createEntityManager();
    em.getTransaction().begin();
    Profile userx = em.find(Profile.class, user.getEmailAddress());
    userx.setProfilePic( user.getProfilePic() );
    em.getTransaction().commit();
    em.close();
    return userx;
}
When updateUserProfilePic is called, it just adds another profilePic to the datastore; it doesn't replace the existing profilePic. Is my implementation correct? I want to update the profile's profilePic.
"Transient" means not persistent and not detached.
Using that version of GAE JPA you need a detached or managed object there if you want it to reuse the existing object.
Using v2 of Googles plugin there is a persistence property that allows merge of a transient object that has "id" fields set.
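As an illustration of "detached or managed": one way to keep the existing child entity is to update the already-managed picture in place instead of attaching a transient replacement. This is only a sketch against the code in the question; the getImage/setImage accessors on ProfilePicture are assumptions:
public Profile updateUserProfilePic(Profile user) {
    EntityManager em = EMF.get().createEntityManager();
    try {
        em.getTransaction().begin();
        Profile userx = em.find(Profile.class, user.getEmailAddress());
        if (userx.getProfilePic() != null && user.getProfilePic() != null) {
            // copy the new state onto the managed picture so no new entity is created
            userx.getProfilePic().setImage(user.getProfilePic().getImage());
        } else {
            userx.setProfilePic(user.getProfilePic());
        }
        em.getTransaction().commit();
        return userx;
    } finally {
        em.close();
    }
}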

Logback dbAppender Custom SQL

Is there a way to change the tables that logback writes its data to when using DBAppender? It has three default tables that must be created before using DBAppender, but I want to customise it to write to one table of my choosing, similar to Log4j where I can specify the SQL that gets executed when inserting the log entry into the database.
Tomasz, maybe I'm missing something, but I don't see how just using a custom DBNameResolver could be the answer to what Magezy asked. DBNameResolver is used by DBAppender via SQLBuilder to construct the three SQL insert queries - through DBNameResolver one can only affect the names of the tables and columns the data is inserted into, but cannot limit inserting to just one table, not to mention that by just implementing DBNameResolver there is no way to control what actually gets inserted.
To match log4j's JDBCAppender, IMO one has to extend logback's DBAppender or DBAppenderBase, or maybe even implement a completely new custom Appender.
The easiest way for me was to make an appender from scratch. I'm appending to a single table, using Spring JDBC. It works something like this:
public class MyAppender extends AppenderBase<ILoggingEvent>
{
    private String _jndiLocation;
    private JdbcTemplate _jt;

    public void setJndiLocation(String jndiLocation)
    {
        _jndiLocation = jndiLocation;
    }

    @Override
    public void start()
    {
        super.start();
        if (_jndiLocation == null)
        {
            throw new IllegalStateException("Must have the JNDI location");
        }
        DataSource ds;
        Context ctx;
        try
        {
            ctx = new InitialContext();
            Object obj = ctx.lookup(_jndiLocation);
            ds = (DataSource) obj;
            if (ds == null)
            {
                throw new IllegalStateException("Failed to obtain data source");
            }
            _jt = new JdbcTemplate(ds);
        }
        catch (Exception ex)
        {
            throw new IllegalStateException("Unable to obtain data source", ex);
        }
    }

    @Override
    protected void append(ILoggingEvent e)
    {
        // log to database here using my JdbcTemplate instance
    }
}
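The append body above is left as a comment; a hypothetical implementation, assuming a single app_log table (the table and column names are made up for illustration and are not part of the original answer), might look like:
@Override
protected void append(ILoggingEvent e)
{
    // app_log and its columns are illustrative only
    _jt.update(
        "INSERT INTO app_log (logged_at, log_level, logger, message) VALUES (?, ?, ?, ?)",
        new java.sql.Timestamp(e.getTimeStamp()),
        e.getLevel().toString(),
        e.getLoggerName(),
        e.getFormattedMessage());
}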
I ran into trouble with SLF4J - the substitute logger error described here:
http://www.slf4j.org/codes.html#substituteLogger
This thread on multi-step configuration enabled me to work around that issue.
You need to implement ch.qos.logback.classic.db.names.DBNameResolver and use it in the configuration:
<appender name="DB" class="ch.qos.logback.classic.db.DBAppender">
<dbNameResolver class="com.example.MyDBNameResolver"/>
<!-- ... -->
</appender>
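For completeness, a sketch of what the com.example.MyDBNameResolver referenced above might look like. It implements logback's ch.qos.logback.classic.db.names.DBNameResolver interface and can only rename the three standard tables and their columns (here by prefixing the table names); it cannot collapse logging into a single table, which is exactly the limitation discussed above:
public class MyDBNameResolver implements ch.qos.logback.classic.db.names.DBNameResolver {

    @Override
    public <N extends Enum<?>> String getTableName(N tableName) {
        // e.g. LOGGING_EVENT -> myapp_logging_event
        return "myapp_" + tableName.name().toLowerCase();
    }

    @Override
    public <N extends Enum<?>> String getColumnName(N columnName) {
        return columnName.name().toLowerCase();
    }
}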
<appender name="CUSTOM_DB_APPENDER" class="com.....MyDbAppender">
<filter class="com......MyFilter"/>
<param name="jndiLocation" value="java:/comp/env/jdbc/....MyPath"/>
</appender>
And your MyDbAppender Java class should have a String jndiLocation field with a setter.
Now do a JNDI lookup (see the solution answered on Oct 17 '11 at 16:03).

Database per tenant with Fluent NHibernate & StructureMap

I'm currently using StructureMap to inject an NHibernateRegistry instance into my DAL, which configures NHibernate for a single connection string and bootstraps a Singleton FluentConfiguration for my single-user app.
How should I modify my Fluent NHibernate configuration to use a different database based on a {tenant} routing parameter in my routing URL?
Routing example:
{tenant}/{controller}/{action}/{id}
...where requests for branch1/Home/Index and branch2/Home/Index use the same application code, but different databases to retrieve the data displayed.
I solved this problem in the past for StructureMap and LINQ by injecting a per-request TenantContext object, which retrieved the routing parameter from the HttpContext it accepted as a constructor parameter and specified a different LINQ data context.
However, I suspect NHibernate has a better way of handling this than anything I could cook up.
Partial NHibernateRegistry class
public class NHibernateRegistry : Registry
{
    // ... private vars here

    public NHibernateRegistry()
    {
        var cfg = Fluently.Configure()
            .Database(MsSqlConfiguration
                .MsSql2008.ConnectionString(c =>
                    c.FromConnectionStringWithKey("TenantConnectionStringKey")))
                    // where to inject this key?
            .ExposeConfiguration(BuildSchema)
            .Mappings(x =>
                x.FluentMappings.AddFromAssembly(typeof(UserMap).Assembly));

        For<FluentConfiguration>().Singleton().Use(cfg);

        var sessionFactory = cfg.BuildSessionFactory();
        For<ISessionFactory>().Singleton()
            .Use(sessionFactory);

        For<ISession>().HybridHttpOrThreadLocalScoped()
            .Use(x => x.GetInstance<ISessionFactory>().OpenSession());

        For<IUnitOfWork>().HybridHttpOrThreadLocalScoped()
            .Use<UnitOfWork>();

        For<IDatabaseBuilder>().Use<DatabaseBuilder>();
    }
}
StructureMap configuration:
public static class Bootstrapper
{
    public static void ConfigureStructureMap()
    {
        ObjectFactory.Initialize(Init);
    }

    private static void Init(IInitializationExpression x)
    {
        x.AddRegistry(new NHibernateRegistry()); // from Data project
    }
}
I'm new to NHibernate, so I'm unsure about how to scope my sessions and configurations. Does NHibernate have a built-in way to handle this?
This worked for me in a module:
return Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008.ConnectionString(x => x.FromConnectionStringWithKey("IMB"))
        .Cache(c => c.UseQueryCache().QueryCacheFactory<StandardQueryCacheFactory>()
            .RegionPrefix("IMB")
            .ProviderClass<HashtableCacheProvider>()
            .UseMinimalPuts()).UseReflectionOptimizer())
    .Mappings(m => m.FluentMappings.AddFromAssembly(Assembly.Load("IMB.Data")))
    .Mappings(m => m.FluentMappings.AddFromAssembly(Assembly.Load("IMB.Security")))
    .ExposeConfiguration(
        c => c.SetProperty("current_session_context_class", "web"))
    .ExposeConfiguration(cfg => _configuration = cfg)
    .BuildSessionFactory();
The problem is that you really want your ISessionFactory object to be a singleton. This means it's best not to specify the connection string when creating the ISessionFactory. Have you tried creating the ISessionFactory without specifying a connection string and then passing a manually created connection to ISessionFactory.OpenSession?
For example:
public ISession CreateSession()
{
    string tennantId = GetTennantId();
    string connStr = ConnectionStringFromTennant(tennantId);
    SqlConnection conn = new SqlConnection(connStr);
    conn.Open();
    ISession session = sessionFactory.OpenSession(conn);
    return session;
}
And then tell StructureMap to call this method.
The downside is that you can no longer build the database schema when creating the ISessionFactory, but maybe creating database schemas in web applications isn't that great an idea anyway?
